Jan 30 03:24:56 np0005601978 kernel: Linux version 5.14.0-665.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026
Jan 30 03:24:56 np0005601978 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 30 03:24:56 np0005601978 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 30 03:24:56 np0005601978 kernel: BIOS-provided physical RAM map:
Jan 30 03:24:56 np0005601978 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 30 03:24:56 np0005601978 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 30 03:24:56 np0005601978 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 30 03:24:56 np0005601978 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 30 03:24:56 np0005601978 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 30 03:24:56 np0005601978 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 30 03:24:56 np0005601978 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 30 03:24:56 np0005601978 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 30 03:24:56 np0005601978 kernel: NX (Execute Disable) protection: active
Jan 30 03:24:56 np0005601978 kernel: APIC: Static calls initialized
Jan 30 03:24:56 np0005601978 kernel: SMBIOS 2.8 present.
Jan 30 03:24:56 np0005601978 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 30 03:24:56 np0005601978 kernel: Hypervisor detected: KVM
Jan 30 03:24:56 np0005601978 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 30 03:24:56 np0005601978 kernel: kvm-clock: using sched offset of 4177259280 cycles
Jan 30 03:24:56 np0005601978 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 30 03:24:56 np0005601978 kernel: tsc: Detected 2799.998 MHz processor
Jan 30 03:24:56 np0005601978 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 30 03:24:56 np0005601978 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 30 03:24:56 np0005601978 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 30 03:24:56 np0005601978 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 30 03:24:56 np0005601978 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 30 03:24:56 np0005601978 kernel: Using GB pages for direct mapping
Jan 30 03:24:56 np0005601978 kernel: RAMDISK: [mem 0x2d410000-0x329fffff]
Jan 30 03:24:56 np0005601978 kernel: ACPI: Early table checksum verification disabled
Jan 30 03:24:56 np0005601978 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 30 03:24:56 np0005601978 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:56 np0005601978 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:56 np0005601978 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:56 np0005601978 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 30 03:24:56 np0005601978 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:56 np0005601978 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 30 03:24:56 np0005601978 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 30 03:24:56 np0005601978 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 30 03:24:56 np0005601978 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 30 03:24:56 np0005601978 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 30 03:24:56 np0005601978 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 30 03:24:56 np0005601978 kernel: No NUMA configuration found
Jan 30 03:24:56 np0005601978 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 30 03:24:56 np0005601978 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 30 03:24:56 np0005601978 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 30 03:24:56 np0005601978 kernel: Zone ranges:
Jan 30 03:24:56 np0005601978 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 30 03:24:56 np0005601978 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 30 03:24:56 np0005601978 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 30 03:24:56 np0005601978 kernel:  Device   empty
Jan 30 03:24:56 np0005601978 kernel: Movable zone start for each node
Jan 30 03:24:56 np0005601978 kernel: Early memory node ranges
Jan 30 03:24:56 np0005601978 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 30 03:24:56 np0005601978 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 30 03:24:56 np0005601978 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 30 03:24:56 np0005601978 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 30 03:24:56 np0005601978 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 30 03:24:56 np0005601978 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 30 03:24:56 np0005601978 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 30 03:24:56 np0005601978 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 30 03:24:56 np0005601978 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 30 03:24:56 np0005601978 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 30 03:24:56 np0005601978 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 30 03:24:56 np0005601978 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 30 03:24:56 np0005601978 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 30 03:24:56 np0005601978 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 30 03:24:56 np0005601978 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 30 03:24:56 np0005601978 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 30 03:24:56 np0005601978 kernel: TSC deadline timer available
Jan 30 03:24:56 np0005601978 kernel: CPU topo: Max. logical packages:   8
Jan 30 03:24:56 np0005601978 kernel: CPU topo: Max. logical dies:       8
Jan 30 03:24:56 np0005601978 kernel: CPU topo: Max. dies per package:   1
Jan 30 03:24:56 np0005601978 kernel: CPU topo: Max. threads per core:   1
Jan 30 03:24:56 np0005601978 kernel: CPU topo: Num. cores per package:     1
Jan 30 03:24:56 np0005601978 kernel: CPU topo: Num. threads per package:   1
Jan 30 03:24:56 np0005601978 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 30 03:24:56 np0005601978 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 30 03:24:56 np0005601978 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 30 03:24:56 np0005601978 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 30 03:24:56 np0005601978 kernel: Booting paravirtualized kernel on KVM
Jan 30 03:24:56 np0005601978 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 30 03:24:56 np0005601978 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 30 03:24:56 np0005601978 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 30 03:24:56 np0005601978 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 30 03:24:56 np0005601978 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 30 03:24:56 np0005601978 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64", will be passed to user space.
Jan 30 03:24:56 np0005601978 kernel: random: crng init done
Jan 30 03:24:56 np0005601978 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: Fallback order for Node 0: 0 
Jan 30 03:24:56 np0005601978 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 30 03:24:56 np0005601978 kernel: Policy zone: Normal
Jan 30 03:24:56 np0005601978 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 30 03:24:56 np0005601978 kernel: software IO TLB: area num 8.
Jan 30 03:24:56 np0005601978 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 30 03:24:56 np0005601978 kernel: ftrace: allocating 49438 entries in 194 pages
Jan 30 03:24:56 np0005601978 kernel: ftrace: allocated 194 pages with 3 groups
Jan 30 03:24:56 np0005601978 kernel: Dynamic Preempt: voluntary
Jan 30 03:24:56 np0005601978 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 30 03:24:56 np0005601978 kernel: rcu: 	RCU event tracing is enabled.
Jan 30 03:24:56 np0005601978 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 30 03:24:56 np0005601978 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 30 03:24:56 np0005601978 kernel: 	Rude variant of Tasks RCU enabled.
Jan 30 03:24:56 np0005601978 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 30 03:24:56 np0005601978 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 30 03:24:56 np0005601978 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 30 03:24:56 np0005601978 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 30 03:24:56 np0005601978 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 30 03:24:56 np0005601978 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 30 03:24:56 np0005601978 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 30 03:24:56 np0005601978 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 30 03:24:56 np0005601978 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 30 03:24:56 np0005601978 kernel: Console: colour VGA+ 80x25
Jan 30 03:24:56 np0005601978 kernel: printk: console [ttyS0] enabled
Jan 30 03:24:56 np0005601978 kernel: ACPI: Core revision 20230331
Jan 30 03:24:56 np0005601978 kernel: APIC: Switch to symmetric I/O mode setup
Jan 30 03:24:56 np0005601978 kernel: x2apic enabled
Jan 30 03:24:56 np0005601978 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 30 03:24:56 np0005601978 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 30 03:24:56 np0005601978 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 30 03:24:56 np0005601978 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 30 03:24:56 np0005601978 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 30 03:24:56 np0005601978 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 30 03:24:56 np0005601978 kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Jan 30 03:24:56 np0005601978 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 30 03:24:56 np0005601978 kernel: Spectre V2 : Mitigation: Retpolines
Jan 30 03:24:56 np0005601978 kernel: RETBleed: Mitigation: untrained return thunk
Jan 30 03:24:56 np0005601978 kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Jan 30 03:24:56 np0005601978 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 30 03:24:56 np0005601978 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 30 03:24:56 np0005601978 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 30 03:24:56 np0005601978 kernel: active return thunk: retbleed_return_thunk
Jan 30 03:24:56 np0005601978 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 30 03:24:56 np0005601978 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 30 03:24:56 np0005601978 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 30 03:24:56 np0005601978 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 30 03:24:56 np0005601978 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 30 03:24:56 np0005601978 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 30 03:24:56 np0005601978 kernel: Freeing SMP alternatives memory: 40K
Jan 30 03:24:56 np0005601978 kernel: pid_max: default: 32768 minimum: 301
Jan 30 03:24:56 np0005601978 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 30 03:24:56 np0005601978 kernel: landlock: Up and running.
Jan 30 03:24:56 np0005601978 kernel: Yama: becoming mindful.
Jan 30 03:24:56 np0005601978 kernel: SELinux:  Initializing.
Jan 30 03:24:56 np0005601978 kernel: LSM support for eBPF active
Jan 30 03:24:56 np0005601978 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 30 03:24:56 np0005601978 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 30 03:24:56 np0005601978 kernel: ... version:                0
Jan 30 03:24:56 np0005601978 kernel: ... bit width:              48
Jan 30 03:24:56 np0005601978 kernel: ... generic registers:      6
Jan 30 03:24:56 np0005601978 kernel: ... value mask:             0000ffffffffffff
Jan 30 03:24:56 np0005601978 kernel: ... max period:             00007fffffffffff
Jan 30 03:24:56 np0005601978 kernel: ... fixed-purpose events:   0
Jan 30 03:24:56 np0005601978 kernel: ... event mask:             000000000000003f
Jan 30 03:24:56 np0005601978 kernel: signal: max sigframe size: 1776
Jan 30 03:24:56 np0005601978 kernel: rcu: Hierarchical SRCU implementation.
Jan 30 03:24:56 np0005601978 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 30 03:24:56 np0005601978 kernel: smp: Bringing up secondary CPUs ...
Jan 30 03:24:56 np0005601978 kernel: smpboot: x86: Booting SMP configuration:
Jan 30 03:24:56 np0005601978 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 30 03:24:56 np0005601978 kernel: smp: Brought up 1 node, 8 CPUs
Jan 30 03:24:56 np0005601978 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 30 03:24:56 np0005601978 kernel: node 0 deferred pages initialised in 10ms
Jan 30 03:24:56 np0005601978 kernel: Memory: 7763724K/8388068K available (16384K kernel code, 5801K rwdata, 13928K rodata, 4196K init, 7192K bss, 618408K reserved, 0K cma-reserved)
Jan 30 03:24:56 np0005601978 kernel: devtmpfs: initialized
Jan 30 03:24:56 np0005601978 kernel: x86/mm: Memory block size: 128MB
Jan 30 03:24:56 np0005601978 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 30 03:24:56 np0005601978 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 30 03:24:56 np0005601978 kernel: pinctrl core: initialized pinctrl subsystem
Jan 30 03:24:56 np0005601978 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 30 03:24:56 np0005601978 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 30 03:24:56 np0005601978 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 30 03:24:56 np0005601978 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 30 03:24:56 np0005601978 kernel: audit: initializing netlink subsys (disabled)
Jan 30 03:24:56 np0005601978 kernel: audit: type=2000 audit(1769761495.486:1): state=initialized audit_enabled=0 res=1
Jan 30 03:24:56 np0005601978 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 30 03:24:56 np0005601978 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 30 03:24:56 np0005601978 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 30 03:24:56 np0005601978 kernel: cpuidle: using governor menu
Jan 30 03:24:56 np0005601978 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 30 03:24:56 np0005601978 kernel: PCI: Using configuration type 1 for base access
Jan 30 03:24:56 np0005601978 kernel: PCI: Using configuration type 1 for extended access
Jan 30 03:24:56 np0005601978 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 30 03:24:56 np0005601978 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 30 03:24:56 np0005601978 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 30 03:24:56 np0005601978 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 30 03:24:56 np0005601978 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 30 03:24:56 np0005601978 kernel: Demotion targets for Node 0: null
Jan 30 03:24:56 np0005601978 kernel: cryptd: max_cpu_qlen set to 1000
Jan 30 03:24:56 np0005601978 kernel: ACPI: Added _OSI(Module Device)
Jan 30 03:24:56 np0005601978 kernel: ACPI: Added _OSI(Processor Device)
Jan 30 03:24:56 np0005601978 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 30 03:24:56 np0005601978 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 30 03:24:56 np0005601978 kernel: ACPI: Interpreter enabled
Jan 30 03:24:56 np0005601978 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 30 03:24:56 np0005601978 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 30 03:24:56 np0005601978 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 30 03:24:56 np0005601978 kernel: PCI: Using E820 reservations for host bridge windows
Jan 30 03:24:56 np0005601978 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 30 03:24:56 np0005601978 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 30 03:24:56 np0005601978 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [3] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [4] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [5] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [6] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [7] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [8] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [9] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [10] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [11] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [12] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [13] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [14] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [15] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [16] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [17] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [18] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [19] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [20] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [21] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [22] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [23] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [24] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [25] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [26] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [27] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [28] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [29] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [30] registered
Jan 30 03:24:56 np0005601978 kernel: acpiphp: Slot [31] registered
Jan 30 03:24:56 np0005601978 kernel: PCI host bridge to bus 0000:00
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 30 03:24:56 np0005601978 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 30 03:24:56 np0005601978 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 30 03:24:56 np0005601978 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 30 03:24:56 np0005601978 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 30 03:24:56 np0005601978 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 30 03:24:56 np0005601978 kernel: iommu: Default domain type: Translated
Jan 30 03:24:56 np0005601978 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 30 03:24:56 np0005601978 kernel: SCSI subsystem initialized
Jan 30 03:24:56 np0005601978 kernel: ACPI: bus type USB registered
Jan 30 03:24:56 np0005601978 kernel: usbcore: registered new interface driver usbfs
Jan 30 03:24:56 np0005601978 kernel: usbcore: registered new interface driver hub
Jan 30 03:24:56 np0005601978 kernel: usbcore: registered new device driver usb
Jan 30 03:24:56 np0005601978 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 30 03:24:56 np0005601978 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 30 03:24:56 np0005601978 kernel: PTP clock support registered
Jan 30 03:24:56 np0005601978 kernel: EDAC MC: Ver: 3.0.0
Jan 30 03:24:56 np0005601978 kernel: NetLabel: Initializing
Jan 30 03:24:56 np0005601978 kernel: NetLabel:  domain hash size = 128
Jan 30 03:24:56 np0005601978 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 30 03:24:56 np0005601978 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 30 03:24:56 np0005601978 kernel: PCI: Using ACPI for IRQ routing
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 30 03:24:56 np0005601978 kernel: vgaarb: loaded
Jan 30 03:24:56 np0005601978 kernel: clocksource: Switched to clocksource kvm-clock
Jan 30 03:24:56 np0005601978 kernel: VFS: Disk quotas dquot_6.6.0
Jan 30 03:24:56 np0005601978 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 30 03:24:56 np0005601978 kernel: pnp: PnP ACPI init
Jan 30 03:24:56 np0005601978 kernel: pnp: PnP ACPI: found 5 devices
Jan 30 03:24:56 np0005601978 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 30 03:24:56 np0005601978 kernel: NET: Registered PF_INET protocol family
Jan 30 03:24:56 np0005601978 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 30 03:24:56 np0005601978 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 30 03:24:56 np0005601978 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 30 03:24:56 np0005601978 kernel: NET: Registered PF_XDP protocol family
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 30 03:24:56 np0005601978 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 30 03:24:56 np0005601978 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 30 03:24:56 np0005601978 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 30751 usecs
Jan 30 03:24:56 np0005601978 kernel: PCI: CLS 0 bytes, default 64
Jan 30 03:24:56 np0005601978 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 30 03:24:56 np0005601978 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 30 03:24:56 np0005601978 kernel: Trying to unpack rootfs image as initramfs...
Jan 30 03:24:56 np0005601978 kernel: ACPI: bus type thunderbolt registered
Jan 30 03:24:56 np0005601978 kernel: Initialise system trusted keyrings
Jan 30 03:24:56 np0005601978 kernel: Key type blacklist registered
Jan 30 03:24:56 np0005601978 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 30 03:24:56 np0005601978 kernel: zbud: loaded
Jan 30 03:24:56 np0005601978 kernel: integrity: Platform Keyring initialized
Jan 30 03:24:56 np0005601978 kernel: integrity: Machine keyring initialized
Jan 30 03:24:56 np0005601978 kernel: Freeing initrd memory: 88000K
Jan 30 03:24:56 np0005601978 kernel: NET: Registered PF_ALG protocol family
Jan 30 03:24:56 np0005601978 kernel: xor: automatically using best checksumming function   avx       
Jan 30 03:24:56 np0005601978 kernel: Key type asymmetric registered
Jan 30 03:24:56 np0005601978 kernel: Asymmetric key parser 'x509' registered
Jan 30 03:24:56 np0005601978 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 30 03:24:56 np0005601978 kernel: io scheduler mq-deadline registered
Jan 30 03:24:56 np0005601978 kernel: io scheduler kyber registered
Jan 30 03:24:56 np0005601978 kernel: io scheduler bfq registered
Jan 30 03:24:56 np0005601978 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 30 03:24:56 np0005601978 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 30 03:24:56 np0005601978 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 30 03:24:56 np0005601978 kernel: ACPI: button: Power Button [PWRF]
Jan 30 03:24:56 np0005601978 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 30 03:24:56 np0005601978 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 30 03:24:56 np0005601978 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 30 03:24:56 np0005601978 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 30 03:24:56 np0005601978 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 30 03:24:56 np0005601978 kernel: Non-volatile memory driver v1.3
Jan 30 03:24:56 np0005601978 kernel: rdac: device handler registered
Jan 30 03:24:56 np0005601978 kernel: hp_sw: device handler registered
Jan 30 03:24:56 np0005601978 kernel: emc: device handler registered
Jan 30 03:24:56 np0005601978 kernel: alua: device handler registered
Jan 30 03:24:56 np0005601978 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 30 03:24:56 np0005601978 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 30 03:24:56 np0005601978 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 30 03:24:56 np0005601978 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 30 03:24:56 np0005601978 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 30 03:24:56 np0005601978 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 30 03:24:56 np0005601978 kernel: usb usb1: Product: UHCI Host Controller
Jan 30 03:24:56 np0005601978 kernel: usb usb1: Manufacturer: Linux 5.14.0-665.el9.x86_64 uhci_hcd
Jan 30 03:24:56 np0005601978 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 30 03:24:56 np0005601978 kernel: hub 1-0:1.0: USB hub found
Jan 30 03:24:56 np0005601978 kernel: hub 1-0:1.0: 2 ports detected
Jan 30 03:24:56 np0005601978 kernel: usbcore: registered new interface driver usbserial_generic
Jan 30 03:24:56 np0005601978 kernel: usbserial: USB Serial support registered for generic
Jan 30 03:24:56 np0005601978 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 30 03:24:56 np0005601978 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 30 03:24:56 np0005601978 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 30 03:24:56 np0005601978 kernel: mousedev: PS/2 mouse device common for all mice
Jan 30 03:24:56 np0005601978 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 30 03:24:56 np0005601978 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 30 03:24:56 np0005601978 kernel: rtc_cmos 00:04: registered as rtc0
Jan 30 03:24:56 np0005601978 kernel: rtc_cmos 00:04: setting system clock to 2026-01-30T08:24:55 UTC (1769761495)
Jan 30 03:24:56 np0005601978 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 30 03:24:56 np0005601978 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 30 03:24:56 np0005601978 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 30 03:24:56 np0005601978 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 30 03:24:56 np0005601978 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 30 03:24:56 np0005601978 kernel: usbcore: registered new interface driver usbhid
Jan 30 03:24:56 np0005601978 kernel: usbhid: USB HID core driver
Jan 30 03:24:56 np0005601978 kernel: drop_monitor: Initializing network drop monitor service
Jan 30 03:24:56 np0005601978 kernel: Initializing XFRM netlink socket
Jan 30 03:24:56 np0005601978 kernel: NET: Registered PF_INET6 protocol family
Jan 30 03:24:56 np0005601978 kernel: Segment Routing with IPv6
Jan 30 03:24:56 np0005601978 kernel: NET: Registered PF_PACKET protocol family
Jan 30 03:24:56 np0005601978 kernel: mpls_gso: MPLS GSO support
Jan 30 03:24:56 np0005601978 kernel: IPI shorthand broadcast: enabled
Jan 30 03:24:56 np0005601978 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 30 03:24:56 np0005601978 kernel: AES CTR mode by8 optimization enabled
Jan 30 03:24:56 np0005601978 kernel: sched_clock: Marking stable (1074002331, 149645607)->(1292057503, -68409565)
Jan 30 03:24:56 np0005601978 kernel: registered taskstats version 1
Jan 30 03:24:56 np0005601978 kernel: Loading compiled-in X.509 certificates
Jan 30 03:24:56 np0005601978 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 30 03:24:56 np0005601978 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 30 03:24:56 np0005601978 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 30 03:24:56 np0005601978 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 30 03:24:56 np0005601978 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 30 03:24:56 np0005601978 kernel: Demotion targets for Node 0: null
Jan 30 03:24:56 np0005601978 kernel: page_owner is disabled
Jan 30 03:24:56 np0005601978 kernel: Key type .fscrypt registered
Jan 30 03:24:56 np0005601978 kernel: Key type fscrypt-provisioning registered
Jan 30 03:24:56 np0005601978 kernel: Key type big_key registered
Jan 30 03:24:56 np0005601978 kernel: Key type encrypted registered
Jan 30 03:24:56 np0005601978 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 30 03:24:56 np0005601978 kernel: Loading compiled-in module X.509 certificates
Jan 30 03:24:56 np0005601978 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 30 03:24:56 np0005601978 kernel: ima: Allocated hash algorithm: sha256
Jan 30 03:24:56 np0005601978 kernel: ima: No architecture policies found
Jan 30 03:24:56 np0005601978 kernel: evm: Initialising EVM extended attributes:
Jan 30 03:24:56 np0005601978 kernel: evm: security.selinux
Jan 30 03:24:56 np0005601978 kernel: evm: security.SMACK64 (disabled)
Jan 30 03:24:56 np0005601978 kernel: evm: security.SMACK64EXEC (disabled)
Jan 30 03:24:56 np0005601978 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 30 03:24:56 np0005601978 kernel: evm: security.SMACK64MMAP (disabled)
Jan 30 03:24:56 np0005601978 kernel: evm: security.apparmor (disabled)
Jan 30 03:24:56 np0005601978 kernel: evm: security.ima
Jan 30 03:24:56 np0005601978 kernel: evm: security.capability
Jan 30 03:24:56 np0005601978 kernel: evm: HMAC attrs: 0x1
Jan 30 03:24:56 np0005601978 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 30 03:24:56 np0005601978 kernel: Running certificate verification RSA selftest
Jan 30 03:24:56 np0005601978 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 30 03:24:56 np0005601978 kernel: Running certificate verification ECDSA selftest
Jan 30 03:24:56 np0005601978 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 30 03:24:56 np0005601978 kernel: clk: Disabling unused clocks
Jan 30 03:24:56 np0005601978 kernel: Freeing unused decrypted memory: 2028K
Jan 30 03:24:56 np0005601978 kernel: Freeing unused kernel image (initmem) memory: 4196K
Jan 30 03:24:56 np0005601978 kernel: Write protecting the kernel read-only data: 30720k
Jan 30 03:24:56 np0005601978 kernel: Freeing unused kernel image (rodata/data gap) memory: 408K
Jan 30 03:24:56 np0005601978 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 30 03:24:56 np0005601978 kernel: Run /init as init process
Jan 30 03:24:56 np0005601978 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 30 03:24:56 np0005601978 systemd: Detected virtualization kvm.
Jan 30 03:24:56 np0005601978 systemd: Detected architecture x86-64.
Jan 30 03:24:56 np0005601978 systemd: Running in initrd.
Jan 30 03:24:56 np0005601978 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 30 03:24:56 np0005601978 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 30 03:24:56 np0005601978 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 30 03:24:56 np0005601978 kernel: usb 1-1: Manufacturer: QEMU
Jan 30 03:24:56 np0005601978 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 30 03:24:56 np0005601978 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 30 03:24:56 np0005601978 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 30 03:24:56 np0005601978 systemd: No hostname configured, using default hostname.
Jan 30 03:24:56 np0005601978 systemd: Hostname set to <localhost>.
Jan 30 03:24:56 np0005601978 systemd: Initializing machine ID from VM UUID.
Jan 30 03:24:56 np0005601978 systemd: Queued start job for default target Initrd Default Target.
Jan 30 03:24:56 np0005601978 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 30 03:24:56 np0005601978 systemd: Reached target Local Encrypted Volumes.
Jan 30 03:24:56 np0005601978 systemd: Reached target Initrd /usr File System.
Jan 30 03:24:56 np0005601978 systemd: Reached target Local File Systems.
Jan 30 03:24:56 np0005601978 systemd: Reached target Path Units.
Jan 30 03:24:56 np0005601978 systemd: Reached target Slice Units.
Jan 30 03:24:56 np0005601978 systemd: Reached target Swaps.
Jan 30 03:24:56 np0005601978 systemd: Reached target Timer Units.
Jan 30 03:24:56 np0005601978 systemd: Listening on D-Bus System Message Bus Socket.
Jan 30 03:24:56 np0005601978 systemd: Listening on Journal Socket (/dev/log).
Jan 30 03:24:56 np0005601978 systemd: Listening on Journal Socket.
Jan 30 03:24:56 np0005601978 systemd: Listening on udev Control Socket.
Jan 30 03:24:56 np0005601978 systemd: Listening on udev Kernel Socket.
Jan 30 03:24:56 np0005601978 systemd: Reached target Socket Units.
Jan 30 03:24:56 np0005601978 systemd: Starting Create List of Static Device Nodes...
Jan 30 03:24:56 np0005601978 systemd: Starting Journal Service...
Jan 30 03:24:56 np0005601978 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 30 03:24:56 np0005601978 systemd: Starting Apply Kernel Variables...
Jan 30 03:24:56 np0005601978 systemd: Starting Create System Users...
Jan 30 03:24:56 np0005601978 systemd: Starting Setup Virtual Console...
Jan 30 03:24:56 np0005601978 systemd: Finished Create List of Static Device Nodes.
Jan 30 03:24:56 np0005601978 systemd: Finished Apply Kernel Variables.
Jan 30 03:24:56 np0005601978 systemd: Finished Create System Users.
Jan 30 03:24:56 np0005601978 systemd-journald[304]: Journal started
Jan 30 03:24:56 np0005601978 systemd-journald[304]: Runtime Journal (/run/log/journal/5c6fc8cfa2fa4ff8aeb92a548fe8efbe) is 8.0M, max 153.6M, 145.6M free.
Jan 30 03:24:56 np0005601978 systemd-sysusers[308]: Creating group 'users' with GID 100.
Jan 30 03:24:56 np0005601978 systemd-sysusers[308]: Creating group 'dbus' with GID 81.
Jan 30 03:24:56 np0005601978 systemd: Started Journal Service.
Jan 30 03:24:56 np0005601978 systemd-sysusers[308]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 30 03:24:56 np0005601978 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 30 03:24:56 np0005601978 systemd[1]: Starting Create Volatile Files and Directories...
Jan 30 03:24:56 np0005601978 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 30 03:24:56 np0005601978 systemd[1]: Finished Create Volatile Files and Directories.
Jan 30 03:24:56 np0005601978 systemd[1]: Finished Setup Virtual Console.
Jan 30 03:24:56 np0005601978 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 30 03:24:56 np0005601978 systemd[1]: Starting dracut cmdline hook...
Jan 30 03:24:56 np0005601978 dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Jan 30 03:24:56 np0005601978 dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 30 03:24:56 np0005601978 systemd[1]: Finished dracut cmdline hook.
Jan 30 03:24:56 np0005601978 systemd[1]: Starting dracut pre-udev hook...
Jan 30 03:24:56 np0005601978 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 30 03:24:56 np0005601978 kernel: device-mapper: uevent: version 1.0.3
Jan 30 03:24:56 np0005601978 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 30 03:24:56 np0005601978 kernel: RPC: Registered named UNIX socket transport module.
Jan 30 03:24:56 np0005601978 kernel: RPC: Registered udp transport module.
Jan 30 03:24:56 np0005601978 kernel: RPC: Registered tcp transport module.
Jan 30 03:24:56 np0005601978 kernel: RPC: Registered tcp-with-tls transport module.
Jan 30 03:24:56 np0005601978 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 30 03:24:56 np0005601978 rpc.statd[441]: Version 2.5.4 starting
Jan 30 03:24:56 np0005601978 rpc.statd[441]: Initializing NSM state
Jan 30 03:24:56 np0005601978 rpc.idmapd[446]: Setting log level to 0
Jan 30 03:24:56 np0005601978 systemd[1]: Finished dracut pre-udev hook.
Jan 30 03:24:56 np0005601978 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 30 03:24:56 np0005601978 systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Jan 30 03:24:56 np0005601978 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 30 03:24:56 np0005601978 systemd[1]: Starting dracut pre-trigger hook...
Jan 30 03:24:56 np0005601978 systemd[1]: Finished dracut pre-trigger hook.
Jan 30 03:24:56 np0005601978 systemd[1]: Starting Coldplug All udev Devices...
Jan 30 03:24:56 np0005601978 systemd[1]: Created slice Slice /system/modprobe.
Jan 30 03:24:56 np0005601978 systemd[1]: Starting Load Kernel Module configfs...
Jan 30 03:24:56 np0005601978 systemd[1]: Finished Coldplug All udev Devices.
Jan 30 03:24:56 np0005601978 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 30 03:24:56 np0005601978 systemd[1]: Finished Load Kernel Module configfs.
Jan 30 03:24:56 np0005601978 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 30 03:24:56 np0005601978 systemd[1]: Reached target Network.
Jan 30 03:24:56 np0005601978 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 30 03:24:56 np0005601978 systemd[1]: Starting dracut initqueue hook...
Jan 30 03:24:56 np0005601978 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 30 03:24:56 np0005601978 systemd-udevd[493]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 03:24:56 np0005601978 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 30 03:24:56 np0005601978 kernel: vda: vda1
Jan 30 03:24:56 np0005601978 systemd[1]: Found device /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 30 03:24:56 np0005601978 kernel: scsi host0: ata_piix
Jan 30 03:24:56 np0005601978 kernel: scsi host1: ata_piix
Jan 30 03:24:56 np0005601978 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 30 03:24:56 np0005601978 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 30 03:24:56 np0005601978 systemd[1]: Reached target Initrd Root Device.
Jan 30 03:24:57 np0005601978 kernel: ata1: found unknown device (class 0)
Jan 30 03:24:57 np0005601978 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 30 03:24:57 np0005601978 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 30 03:24:57 np0005601978 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 30 03:24:57 np0005601978 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 30 03:24:57 np0005601978 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 30 03:24:57 np0005601978 systemd[1]: Mounting Kernel Configuration File System...
Jan 30 03:24:57 np0005601978 systemd[1]: Mounted Kernel Configuration File System.
Jan 30 03:24:57 np0005601978 systemd[1]: Reached target System Initialization.
Jan 30 03:24:57 np0005601978 systemd[1]: Reached target Basic System.
Jan 30 03:24:57 np0005601978 systemd[1]: Finished dracut initqueue hook.
Jan 30 03:24:57 np0005601978 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 30 03:24:57 np0005601978 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 30 03:24:57 np0005601978 systemd[1]: Reached target Remote File Systems.
Jan 30 03:24:57 np0005601978 systemd[1]: Starting dracut pre-mount hook...
Jan 30 03:24:57 np0005601978 systemd[1]: Finished dracut pre-mount hook.
Jan 30 03:24:57 np0005601978 systemd[1]: Starting File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8...
Jan 30 03:24:57 np0005601978 systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Jan 30 03:24:57 np0005601978 systemd[1]: Finished File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 30 03:24:57 np0005601978 systemd[1]: Mounting /sysroot...
Jan 30 03:24:57 np0005601978 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 30 03:24:57 np0005601978 kernel: XFS (vda1): Mounting V5 Filesystem 822f14ea-6e7e-41df-b0d8-fbe282d9ded8
Jan 30 03:24:57 np0005601978 kernel: XFS (vda1): Ending clean mount
Jan 30 03:24:57 np0005601978 systemd[1]: Mounted /sysroot.
Jan 30 03:24:57 np0005601978 systemd[1]: Reached target Initrd Root File System.
Jan 30 03:24:57 np0005601978 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 30 03:24:57 np0005601978 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 30 03:24:57 np0005601978 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 30 03:24:57 np0005601978 systemd[1]: Reached target Initrd File Systems.
Jan 30 03:24:57 np0005601978 systemd[1]: Reached target Initrd Default Target.
Jan 30 03:24:57 np0005601978 systemd[1]: Starting dracut mount hook...
Jan 30 03:24:58 np0005601978 systemd[1]: Finished dracut mount hook.
Jan 30 03:24:58 np0005601978 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 30 03:24:58 np0005601978 rpc.idmapd[446]: exiting on signal 15
Jan 30 03:24:58 np0005601978 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 30 03:24:58 np0005601978 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Network.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Timer Units.
Jan 30 03:24:58 np0005601978 systemd[1]: dbus.socket: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 30 03:24:58 np0005601978 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Initrd Default Target.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Basic System.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Initrd Root Device.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Initrd /usr File System.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Path Units.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Remote File Systems.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Slice Units.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Socket Units.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target System Initialization.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Local File Systems.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Swaps.
Jan 30 03:24:58 np0005601978 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped dracut mount hook.
Jan 30 03:24:58 np0005601978 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped dracut pre-mount hook.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 30 03:24:58 np0005601978 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped dracut initqueue hook.
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Apply Kernel Variables.
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Coldplug All udev Devices.
Jan 30 03:24:58 np0005601978 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped dracut pre-trigger hook.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Setup Virtual Console.
Jan 30 03:24:58 np0005601978 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Closed udev Control Socket.
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Closed udev Kernel Socket.
Jan 30 03:24:58 np0005601978 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped dracut pre-udev hook.
Jan 30 03:24:58 np0005601978 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped dracut cmdline hook.
Jan 30 03:24:58 np0005601978 systemd[1]: Starting Cleanup udev Database...
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 30 03:24:58 np0005601978 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 30 03:24:58 np0005601978 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Stopped Create System Users.
Jan 30 03:24:58 np0005601978 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 30 03:24:58 np0005601978 systemd[1]: Finished Cleanup udev Database.
Jan 30 03:24:58 np0005601978 systemd[1]: Reached target Switch Root.
Jan 30 03:24:58 np0005601978 systemd[1]: Starting Switch Root...
Jan 30 03:24:58 np0005601978 systemd[1]: Switching root.
Jan 30 03:24:58 np0005601978 systemd-journald[304]: Journal stopped
Jan 30 03:24:59 np0005601978 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 30 03:24:59 np0005601978 kernel: audit: type=1404 audit(1769761498.436:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 30 03:24:59 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 03:24:59 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 03:24:59 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 03:24:59 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 03:24:59 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 03:24:59 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 03:24:59 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 03:24:59 np0005601978 kernel: audit: type=1403 audit(1769761498.529:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 30 03:24:59 np0005601978 systemd: Successfully loaded SELinux policy in 95.941ms.
Jan 30 03:24:59 np0005601978 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.052ms.
Jan 30 03:24:59 np0005601978 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 30 03:24:59 np0005601978 systemd: Detected virtualization kvm.
Jan 30 03:24:59 np0005601978 systemd: Detected architecture x86-64.
Jan 30 03:24:59 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:24:59 np0005601978 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 30 03:24:59 np0005601978 systemd: Stopped Switch Root.
Jan 30 03:24:59 np0005601978 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 30 03:24:59 np0005601978 systemd: Created slice Slice /system/getty.
Jan 30 03:24:59 np0005601978 systemd: Created slice Slice /system/serial-getty.
Jan 30 03:24:59 np0005601978 systemd: Created slice Slice /system/sshd-keygen.
Jan 30 03:24:59 np0005601978 systemd: Created slice User and Session Slice.
Jan 30 03:24:59 np0005601978 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 30 03:24:59 np0005601978 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 30 03:24:59 np0005601978 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 30 03:24:59 np0005601978 systemd: Reached target Local Encrypted Volumes.
Jan 30 03:24:59 np0005601978 systemd: Stopped target Switch Root.
Jan 30 03:24:59 np0005601978 systemd: Stopped target Initrd File Systems.
Jan 30 03:24:59 np0005601978 systemd: Stopped target Initrd Root File System.
Jan 30 03:24:59 np0005601978 systemd: Reached target Local Integrity Protected Volumes.
Jan 30 03:24:59 np0005601978 systemd: Reached target Path Units.
Jan 30 03:24:59 np0005601978 systemd: Reached target rpc_pipefs.target.
Jan 30 03:24:59 np0005601978 systemd: Reached target Slice Units.
Jan 30 03:24:59 np0005601978 systemd: Reached target Swaps.
Jan 30 03:24:59 np0005601978 systemd: Reached target Local Verity Protected Volumes.
Jan 30 03:24:59 np0005601978 systemd: Listening on RPCbind Server Activation Socket.
Jan 30 03:24:59 np0005601978 systemd: Reached target RPC Port Mapper.
Jan 30 03:24:59 np0005601978 systemd: Listening on Process Core Dump Socket.
Jan 30 03:24:59 np0005601978 systemd: Listening on initctl Compatibility Named Pipe.
Jan 30 03:24:59 np0005601978 systemd: Listening on udev Control Socket.
Jan 30 03:24:59 np0005601978 systemd: Listening on udev Kernel Socket.
Jan 30 03:24:59 np0005601978 systemd: Mounting Huge Pages File System...
Jan 30 03:24:59 np0005601978 systemd: Mounting POSIX Message Queue File System...
Jan 30 03:24:59 np0005601978 systemd: Mounting Kernel Debug File System...
Jan 30 03:24:59 np0005601978 systemd: Mounting Kernel Trace File System...
Jan 30 03:24:59 np0005601978 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 30 03:24:59 np0005601978 systemd: Starting Create List of Static Device Nodes...
Jan 30 03:24:59 np0005601978 systemd: Starting Load Kernel Module configfs...
Jan 30 03:24:59 np0005601978 systemd: Starting Load Kernel Module drm...
Jan 30 03:24:59 np0005601978 systemd: Starting Load Kernel Module efi_pstore...
Jan 30 03:24:59 np0005601978 systemd: Starting Load Kernel Module fuse...
Jan 30 03:24:59 np0005601978 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 30 03:24:59 np0005601978 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 30 03:24:59 np0005601978 systemd: Stopped File System Check on Root Device.
Jan 30 03:24:59 np0005601978 systemd: Stopped Journal Service.
Jan 30 03:24:59 np0005601978 kernel: fuse: init (API version 7.37)
Jan 30 03:24:59 np0005601978 systemd: Starting Journal Service...
Jan 30 03:24:59 np0005601978 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 30 03:24:59 np0005601978 systemd: Starting Generate network units from Kernel command line...
Jan 30 03:24:59 np0005601978 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 30 03:24:59 np0005601978 systemd: Starting Remount Root and Kernel File Systems...
Jan 30 03:24:59 np0005601978 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 30 03:24:59 np0005601978 systemd: Starting Apply Kernel Variables...
Jan 30 03:24:59 np0005601978 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 30 03:24:59 np0005601978 systemd-journald[678]: Journal started
Jan 30 03:24:59 np0005601978 systemd-journald[678]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 30 03:24:59 np0005601978 systemd[1]: Queued start job for default target Multi-User System.
Jan 30 03:24:59 np0005601978 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 30 03:24:59 np0005601978 systemd: Starting Coldplug All udev Devices...
Jan 30 03:24:59 np0005601978 systemd: Started Journal Service.
Jan 30 03:24:59 np0005601978 systemd[1]: Mounted Huge Pages File System.
Jan 30 03:24:59 np0005601978 systemd[1]: Mounted POSIX Message Queue File System.
Jan 30 03:24:59 np0005601978 systemd[1]: Mounted Kernel Debug File System.
Jan 30 03:24:59 np0005601978 kernel: ACPI: bus type drm_connector registered
Jan 30 03:24:59 np0005601978 systemd[1]: Mounted Kernel Trace File System.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Create List of Static Device Nodes.
Jan 30 03:24:59 np0005601978 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Load Kernel Module configfs.
Jan 30 03:24:59 np0005601978 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Load Kernel Module drm.
Jan 30 03:24:59 np0005601978 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 30 03:24:59 np0005601978 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Load Kernel Module fuse.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Generate network units from Kernel command line.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Apply Kernel Variables.
Jan 30 03:24:59 np0005601978 systemd[1]: Mounting FUSE Control File System...
Jan 30 03:24:59 np0005601978 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Rebuild Hardware Database...
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 30 03:24:59 np0005601978 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Load/Save OS Random Seed...
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Create System Users...
Jan 30 03:24:59 np0005601978 systemd[1]: Mounted FUSE Control File System.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Coldplug All udev Devices.
Jan 30 03:24:59 np0005601978 systemd-journald[678]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 30 03:24:59 np0005601978 systemd-journald[678]: Received client request to flush runtime journal.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Load/Save OS Random Seed.
Jan 30 03:24:59 np0005601978 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Create System Users.
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 30 03:24:59 np0005601978 systemd[1]: Reached target Preparation for Local File Systems.
Jan 30 03:24:59 np0005601978 systemd[1]: Reached target Local File Systems.
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 30 03:24:59 np0005601978 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 30 03:24:59 np0005601978 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 30 03:24:59 np0005601978 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Automatic Boot Loader Update...
Jan 30 03:24:59 np0005601978 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Create Volatile Files and Directories...
Jan 30 03:24:59 np0005601978 bootctl[695]: Couldn't find EFI system partition, skipping.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Automatic Boot Loader Update.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Create Volatile Files and Directories.
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Security Auditing Service...
Jan 30 03:24:59 np0005601978 systemd[1]: Starting RPC Bind...
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Rebuild Journal Catalog...
Jan 30 03:24:59 np0005601978 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 30 03:24:59 np0005601978 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Rebuild Journal Catalog.
Jan 30 03:24:59 np0005601978 augenrules[706]: /sbin/augenrules: No change
Jan 30 03:24:59 np0005601978 systemd[1]: Started RPC Bind.
Jan 30 03:24:59 np0005601978 augenrules[721]: No rules
Jan 30 03:24:59 np0005601978 augenrules[721]: enabled 1
Jan 30 03:24:59 np0005601978 augenrules[721]: failure 1
Jan 30 03:24:59 np0005601978 augenrules[721]: pid 701
Jan 30 03:24:59 np0005601978 augenrules[721]: rate_limit 0
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_limit 8192
Jan 30 03:24:59 np0005601978 augenrules[721]: lost 0
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog 3
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_wait_time 60000
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_wait_time_actual 0
Jan 30 03:24:59 np0005601978 augenrules[721]: enabled 1
Jan 30 03:24:59 np0005601978 augenrules[721]: failure 1
Jan 30 03:24:59 np0005601978 augenrules[721]: pid 701
Jan 30 03:24:59 np0005601978 augenrules[721]: rate_limit 0
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_limit 8192
Jan 30 03:24:59 np0005601978 augenrules[721]: lost 0
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog 3
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_wait_time 60000
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_wait_time_actual 0
Jan 30 03:24:59 np0005601978 augenrules[721]: enabled 1
Jan 30 03:24:59 np0005601978 augenrules[721]: failure 1
Jan 30 03:24:59 np0005601978 augenrules[721]: pid 701
Jan 30 03:24:59 np0005601978 augenrules[721]: rate_limit 0
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_limit 8192
Jan 30 03:24:59 np0005601978 augenrules[721]: lost 0
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog 3
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_wait_time 60000
Jan 30 03:24:59 np0005601978 augenrules[721]: backlog_wait_time_actual 0
Jan 30 03:24:59 np0005601978 systemd[1]: Started Security Auditing Service.
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Rebuild Hardware Database.
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 30 03:24:59 np0005601978 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Jan 30 03:24:59 np0005601978 systemd[1]: Starting Update is Completed...
Jan 30 03:24:59 np0005601978 systemd[1]: Finished Update is Completed.
Jan 30 03:24:59 np0005601978 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 30 03:24:59 np0005601978 systemd[1]: Reached target System Initialization.
Jan 30 03:24:59 np0005601978 systemd[1]: Started dnf makecache --timer.
Jan 30 03:24:59 np0005601978 systemd[1]: Started Daily rotation of log files.
Jan 30 03:24:59 np0005601978 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 30 03:24:59 np0005601978 systemd[1]: Reached target Timer Units.
Jan 30 03:24:59 np0005601978 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 30 03:24:59 np0005601978 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 30 03:24:59 np0005601978 systemd[1]: Reached target Socket Units.
Jan 30 03:25:00 np0005601978 systemd[1]: Starting D-Bus System Message Bus...
Jan 30 03:25:00 np0005601978 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 30 03:25:00 np0005601978 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 30 03:25:00 np0005601978 systemd[1]: Starting Load Kernel Module configfs...
Jan 30 03:25:00 np0005601978 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 30 03:25:00 np0005601978 systemd[1]: Finished Load Kernel Module configfs.
Jan 30 03:25:00 np0005601978 systemd-udevd[740]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 03:25:00 np0005601978 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 30 03:25:00 np0005601978 systemd[1]: Started D-Bus System Message Bus.
Jan 30 03:25:00 np0005601978 systemd[1]: Reached target Basic System.
Jan 30 03:25:00 np0005601978 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 30 03:25:00 np0005601978 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 30 03:25:00 np0005601978 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 30 03:25:00 np0005601978 dbus-broker-lau[760]: Ready
Jan 30 03:25:00 np0005601978 systemd[1]: Starting NTP client/server...
Jan 30 03:25:00 np0005601978 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 30 03:25:00 np0005601978 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 30 03:25:00 np0005601978 systemd[1]: Starting IPv4 firewall with iptables...
Jan 30 03:25:00 np0005601978 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 30 03:25:00 np0005601978 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 30 03:25:00 np0005601978 systemd[1]: Started irqbalance daemon.
Jan 30 03:25:00 np0005601978 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 30 03:25:00 np0005601978 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 03:25:00 np0005601978 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 03:25:00 np0005601978 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 03:25:00 np0005601978 systemd[1]: Reached target sshd-keygen.target.
Jan 30 03:25:00 np0005601978 kernel: Console: switching to colour dummy device 80x25
Jan 30 03:25:00 np0005601978 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 30 03:25:00 np0005601978 kernel: [drm] features: -context_init
Jan 30 03:25:00 np0005601978 kernel: [drm] number of scanouts: 1
Jan 30 03:25:00 np0005601978 kernel: [drm] number of cap sets: 0
Jan 30 03:25:00 np0005601978 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 30 03:25:00 np0005601978 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 30 03:25:00 np0005601978 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 30 03:25:00 np0005601978 systemd[1]: Reached target User and Group Name Lookups.
Jan 30 03:25:00 np0005601978 kernel: Console: switching to colour frame buffer device 128x48
Jan 30 03:25:00 np0005601978 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 30 03:25:00 np0005601978 chronyd[799]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 30 03:25:00 np0005601978 chronyd[799]: Loaded 0 symmetric keys
Jan 30 03:25:00 np0005601978 chronyd[799]: Using right/UTC timezone to obtain leap second data
Jan 30 03:25:00 np0005601978 chronyd[799]: Loaded seccomp filter (level 2)
Jan 30 03:25:00 np0005601978 systemd[1]: Starting User Login Management...
Jan 30 03:25:00 np0005601978 systemd[1]: Started NTP client/server.
Jan 30 03:25:00 np0005601978 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 30 03:25:00 np0005601978 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 30 03:25:00 np0005601978 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 30 03:25:00 np0005601978 systemd-logind[793]: New seat seat0.
Jan 30 03:25:00 np0005601978 systemd-logind[793]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 30 03:25:00 np0005601978 systemd-logind[793]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 30 03:25:00 np0005601978 systemd[1]: Started User Login Management.
Jan 30 03:25:00 np0005601978 kernel: kvm_amd: TSC scaling supported
Jan 30 03:25:00 np0005601978 kernel: kvm_amd: Nested Virtualization enabled
Jan 30 03:25:00 np0005601978 kernel: kvm_amd: Nested Paging enabled
Jan 30 03:25:00 np0005601978 kernel: kvm_amd: LBR virtualization supported
Jan 30 03:25:00 np0005601978 iptables.init[778]: iptables: Applying firewall rules: [  OK  ]
Jan 30 03:25:00 np0005601978 systemd[1]: Finished IPv4 firewall with iptables.
Jan 30 03:25:01 np0005601978 cloud-init[837]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 30 Jan 2026 08:25:00 +0000. Up 6.50 seconds.
Jan 30 03:25:01 np0005601978 systemd[1]: run-cloud\x2dinit-tmp-tmpvunxb9tr.mount: Deactivated successfully.
Jan 30 03:25:01 np0005601978 systemd[1]: Starting Hostname Service...
Jan 30 03:25:01 np0005601978 systemd[1]: Started Hostname Service.
Jan 30 03:25:01 np0005601978 systemd-hostnamed[852]: Hostname set to <np0005601978.novalocal> (static)
Jan 30 03:25:01 np0005601978 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 30 03:25:01 np0005601978 systemd[1]: Reached target Preparation for Network.
Jan 30 03:25:01 np0005601978 systemd[1]: Starting Network Manager...
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7226] NetworkManager (version 1.54.3-2.el9) is starting... (boot:85b9e2c8-8235-417d-81be-396eb2d5c232)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7230] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7393] manager[0x564f01222000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7471] hostname: hostname: using hostnamed
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7471] hostname: static hostname changed from (none) to "np0005601978.novalocal"
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7477] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7645] manager[0x564f01222000]: rfkill: Wi-Fi hardware radio set enabled
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7648] manager[0x564f01222000]: rfkill: WWAN hardware radio set enabled
Jan 30 03:25:01 np0005601978 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7745] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7745] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7746] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7747] manager: Networking is enabled by state file
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7750] settings: Loaded settings plugin: keyfile (internal)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7786] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7811] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7824] dhcp: init: Using DHCP client 'internal'
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7828] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7841] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7853] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7865] device (lo): Activation: starting connection 'lo' (33b2f94f-fa01-4cf4-b364-8f6cc69c6981)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7873] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7876] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 03:25:01 np0005601978 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 03:25:01 np0005601978 systemd[1]: Started Network Manager.
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7922] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7926] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7928] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7930] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7936] device (eth0): carrier: link connected
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7940] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 30 03:25:01 np0005601978 systemd[1]: Reached target Network.
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7948] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7959] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7964] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7966] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7969] manager: NetworkManager state is now CONNECTING
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7972] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 03:25:01 np0005601978 systemd[1]: Starting Network Manager Wait Online...
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7980] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.7985] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:25:01 np0005601978 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 30 03:25:01 np0005601978 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.8118] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.8121] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.8133] device (lo): Activation: successful, device activated.
Jan 30 03:25:01 np0005601978 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 30 03:25:01 np0005601978 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 30 03:25:01 np0005601978 systemd[1]: Reached target NFS client services.
Jan 30 03:25:01 np0005601978 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 30 03:25:01 np0005601978 systemd[1]: Reached target Remote File Systems.
Jan 30 03:25:01 np0005601978 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9120] dhcp4 (eth0): state changed new lease, address=38.102.83.136
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9133] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9157] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9187] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9188] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9191] manager: NetworkManager state is now CONNECTED_SITE
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9194] device (eth0): Activation: successful, device activated.
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9202] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 30 03:25:01 np0005601978 NetworkManager[856]: <info>  [1769761501.9205] manager: startup complete
Jan 30 03:25:01 np0005601978 systemd[1]: Finished Network Manager Wait Online.
Jan 30 03:25:01 np0005601978 systemd[1]: Starting Cloud-init: Network Stage...
Jan 30 03:25:02 np0005601978 cloud-init[919]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 30 Jan 2026 08:25:02 +0000. Up 7.70 seconds.
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |  eth0  | True |        38.102.83.136         | 255.255.255.0 | global | fa:16:3e:ff:d3:ca |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |  eth0  | True | fe80::f816:3eff:feff:d3ca/64 |       .       |  link  | fa:16:3e:ff:d3:ca |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 30 03:25:02 np0005601978 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 30 03:25:03 np0005601978 cloud-init[919]: Generating public/private rsa key pair.
Jan 30 03:25:03 np0005601978 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 30 03:25:03 np0005601978 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 30 03:25:03 np0005601978 cloud-init[919]: The key fingerprint is:
Jan 30 03:25:03 np0005601978 cloud-init[919]: SHA256:ylEMKhi39LdckgyM8mRMQYL85HKUJwqEwsXqNhu7y9g root@np0005601978.novalocal
Jan 30 03:25:03 np0005601978 cloud-init[919]: The key's randomart image is:
Jan 30 03:25:03 np0005601978 cloud-init[919]: +---[RSA 3072]----+
Jan 30 03:25:03 np0005601978 cloud-init[919]: |O=O=o .          |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |*O=O.= +         |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |+=X = = +        |
Jan 30 03:25:03 np0005601978 cloud-init[919]: | +.= o =         |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |. o   + S        |
Jan 30 03:25:03 np0005601978 cloud-init[919]: | =   . o         |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |. =   o          |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |o+               |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |.+E              |
Jan 30 03:25:03 np0005601978 cloud-init[919]: +----[SHA256]-----+
Jan 30 03:25:03 np0005601978 cloud-init[919]: Generating public/private ecdsa key pair.
Jan 30 03:25:03 np0005601978 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 30 03:25:03 np0005601978 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 30 03:25:03 np0005601978 cloud-init[919]: The key fingerprint is:
Jan 30 03:25:03 np0005601978 cloud-init[919]: SHA256:vkh3gmNA92lgw5CptfGQguwsAaBSHagWvYcFklUtVAI root@np0005601978.novalocal
Jan 30 03:25:03 np0005601978 cloud-init[919]: The key's randomart image is:
Jan 30 03:25:03 np0005601978 cloud-init[919]: +---[ECDSA 256]---+
Jan 30 03:25:03 np0005601978 cloud-init[919]: |*o=EBOo.         |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |+=+.O=..         |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |=o.+==B          |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |++.+.+.+ .       |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |o   o   S        |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |     . +         |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |      = + .      |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |     o + +       |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |      . .        |
Jan 30 03:25:03 np0005601978 cloud-init[919]: +----[SHA256]-----+
Jan 30 03:25:03 np0005601978 cloud-init[919]: Generating public/private ed25519 key pair.
Jan 30 03:25:03 np0005601978 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 30 03:25:03 np0005601978 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 30 03:25:03 np0005601978 cloud-init[919]: The key fingerprint is:
Jan 30 03:25:03 np0005601978 cloud-init[919]: SHA256:SlQMXImiGkCqokOhRoGsLsxaA/YQqpAZDrPVxA1NiE0 root@np0005601978.novalocal
Jan 30 03:25:03 np0005601978 cloud-init[919]: The key's randomart image is:
Jan 30 03:25:03 np0005601978 cloud-init[919]: +--[ED25519 256]--+
Jan 30 03:25:03 np0005601978 cloud-init[919]: |+o BEBo=o.       |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |Bo+ * =.o        |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |BX.. ..          |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |&+o  .           |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |@*o   . S        |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |O+o. . .         |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |=. .  .          |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |..               |
Jan 30 03:25:03 np0005601978 cloud-init[919]: |                 |
Jan 30 03:25:03 np0005601978 cloud-init[919]: +----[SHA256]-----+
Jan 30 03:25:03 np0005601978 sm-notify[1002]: Version 2.5.4 starting
Jan 30 03:25:03 np0005601978 systemd[1]: Finished Cloud-init: Network Stage.
Jan 30 03:25:03 np0005601978 systemd[1]: Reached target Cloud-config availability.
Jan 30 03:25:03 np0005601978 systemd[1]: Reached target Network is Online.
Jan 30 03:25:03 np0005601978 systemd[1]: Starting Cloud-init: Config Stage...
Jan 30 03:25:03 np0005601978 systemd[1]: Starting Crash recovery kernel arming...
Jan 30 03:25:03 np0005601978 systemd[1]: Starting Notify NFS peers of a restart...
Jan 30 03:25:03 np0005601978 systemd[1]: Starting System Logging Service...
Jan 30 03:25:03 np0005601978 systemd[1]: Starting OpenSSH server daemon...
Jan 30 03:25:03 np0005601978 systemd[1]: Starting Permit User Sessions...
Jan 30 03:25:03 np0005601978 systemd[1]: Started Notify NFS peers of a restart.
Jan 30 03:25:03 np0005601978 systemd[1]: Finished Permit User Sessions.
Jan 30 03:25:03 np0005601978 systemd[1]: Started OpenSSH server daemon.
Jan 30 03:25:03 np0005601978 systemd[1]: Started Command Scheduler.
Jan 30 03:25:03 np0005601978 systemd[1]: Started Getty on tty1.
Jan 30 03:25:03 np0005601978 systemd[1]: Started Serial Getty on ttyS0.
Jan 30 03:25:03 np0005601978 systemd[1]: Reached target Login Prompts.
Jan 30 03:25:03 np0005601978 rsyslogd[1003]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1003" x-info="https://www.rsyslog.com"] start
Jan 30 03:25:03 np0005601978 systemd[1]: Started System Logging Service.
Jan 30 03:25:03 np0005601978 rsyslogd[1003]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 30 03:25:03 np0005601978 systemd[1]: Reached target Multi-User System.
Jan 30 03:25:03 np0005601978 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 30 03:25:03 np0005601978 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 30 03:25:03 np0005601978 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 30 03:25:03 np0005601978 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 03:25:04 np0005601978 kdumpctl[1017]: kdump: No kdump initial ramdisk found.
Jan 30 03:25:04 np0005601978 kdumpctl[1017]: kdump: Rebuilding /boot/initramfs-5.14.0-665.el9.x86_64kdump.img
Jan 30 03:25:04 np0005601978 cloud-init[1082]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 30 Jan 2026 08:25:04 +0000. Up 9.52 seconds.
Jan 30 03:25:04 np0005601978 systemd[1]: Finished Cloud-init: Config Stage.
Jan 30 03:25:04 np0005601978 systemd[1]: Starting Cloud-init: Final Stage...
Jan 30 03:25:04 np0005601978 dracut[1266]: dracut-057-102.git20250818.el9
Jan 30 03:25:04 np0005601978 cloud-init[1267]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 30 Jan 2026 08:25:04 +0000. Up 9.94 seconds.
Jan 30 03:25:04 np0005601978 cloud-init[1284]: #############################################################
Jan 30 03:25:04 np0005601978 cloud-init[1285]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 30 03:25:04 np0005601978 cloud-init[1287]: 256 SHA256:vkh3gmNA92lgw5CptfGQguwsAaBSHagWvYcFklUtVAI root@np0005601978.novalocal (ECDSA)
Jan 30 03:25:04 np0005601978 cloud-init[1289]: 256 SHA256:SlQMXImiGkCqokOhRoGsLsxaA/YQqpAZDrPVxA1NiE0 root@np0005601978.novalocal (ED25519)
Jan 30 03:25:04 np0005601978 cloud-init[1291]: 3072 SHA256:ylEMKhi39LdckgyM8mRMQYL85HKUJwqEwsXqNhu7y9g root@np0005601978.novalocal (RSA)
Jan 30 03:25:04 np0005601978 cloud-init[1292]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 30 03:25:04 np0005601978 cloud-init[1293]: #############################################################
Jan 30 03:25:04 np0005601978 cloud-init[1267]: Cloud-init v. 24.4-8.el9 finished at Fri, 30 Jan 2026 08:25:04 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.12 seconds
Jan 30 03:25:04 np0005601978 dracut[1269]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-665.el9.x86_64kdump.img 5.14.0-665.el9.x86_64
Jan 30 03:25:04 np0005601978 systemd[1]: Finished Cloud-init: Final Stage.
Jan 30 03:25:04 np0005601978 systemd[1]: Reached target Cloud-init target.
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: memstrack is not available
Jan 30 03:25:05 np0005601978 dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 30 03:25:05 np0005601978 dracut[1269]: memstrack is not available
Jan 30 03:25:05 np0005601978 dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 30 03:25:06 np0005601978 dracut[1269]: *** Including module: systemd ***
Jan 30 03:25:06 np0005601978 dracut[1269]: *** Including module: fips ***
Jan 30 03:25:06 np0005601978 dracut[1269]: *** Including module: systemd-initrd ***
Jan 30 03:25:06 np0005601978 dracut[1269]: *** Including module: i18n ***
Jan 30 03:25:06 np0005601978 dracut[1269]: *** Including module: drm ***
Jan 30 03:25:06 np0005601978 chronyd[799]: Selected source 144.217.93.2 (2.centos.pool.ntp.org)
Jan 30 03:25:06 np0005601978 chronyd[799]: System clock TAI offset set to 37 seconds
Jan 30 03:25:07 np0005601978 dracut[1269]: *** Including module: prefixdevname ***
Jan 30 03:25:07 np0005601978 dracut[1269]: *** Including module: kernel-modules ***
Jan 30 03:25:07 np0005601978 kernel: block vda: the capability attribute has been deprecated.
Jan 30 03:25:07 np0005601978 dracut[1269]: *** Including module: kernel-modules-extra ***
Jan 30 03:25:07 np0005601978 dracut[1269]: *** Including module: qemu ***
Jan 30 03:25:08 np0005601978 dracut[1269]: *** Including module: fstab-sys ***
Jan 30 03:25:08 np0005601978 dracut[1269]: *** Including module: rootfs-block ***
Jan 30 03:25:08 np0005601978 dracut[1269]: *** Including module: terminfo ***
Jan 30 03:25:08 np0005601978 dracut[1269]: *** Including module: udev-rules ***
Jan 30 03:25:08 np0005601978 dracut[1269]: Skipping udev rule: 91-permissions.rules
Jan 30 03:25:08 np0005601978 dracut[1269]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 30 03:25:08 np0005601978 dracut[1269]: *** Including module: virtiofs ***
Jan 30 03:25:08 np0005601978 dracut[1269]: *** Including module: dracut-systemd ***
Jan 30 03:25:09 np0005601978 dracut[1269]: *** Including module: usrmount ***
Jan 30 03:25:09 np0005601978 dracut[1269]: *** Including module: base ***
Jan 30 03:25:09 np0005601978 dracut[1269]: *** Including module: fs-lib ***
Jan 30 03:25:09 np0005601978 dracut[1269]: *** Including module: kdumpbase ***
Jan 30 03:25:09 np0005601978 dracut[1269]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 30 03:25:09 np0005601978 dracut[1269]:  microcode_ctl module: mangling fw_dir
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel" is ignored
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 30 03:25:09 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 30 03:25:10 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 30 03:25:10 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 30 03:25:10 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 30 03:25:10 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 30 03:25:10 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 30 03:25:10 np0005601978 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 30 03:25:10 np0005601978 dracut[1269]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 30 03:25:10 np0005601978 dracut[1269]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 30 03:25:10 np0005601978 dracut[1269]: *** Including module: openssl ***
Jan 30 03:25:10 np0005601978 dracut[1269]: *** Including module: shutdown ***
Jan 30 03:25:10 np0005601978 dracut[1269]: *** Including module: squash ***
Jan 30 03:25:10 np0005601978 dracut[1269]: *** Including modules done ***
Jan 30 03:25:10 np0005601978 dracut[1269]: *** Installing kernel module dependencies ***
Jan 30 03:25:10 np0005601978 irqbalance[783]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 30 03:25:10 np0005601978 irqbalance[783]: IRQ 35 affinity is now unmanaged
Jan 30 03:25:10 np0005601978 irqbalance[783]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 30 03:25:10 np0005601978 irqbalance[783]: IRQ 33 affinity is now unmanaged
Jan 30 03:25:10 np0005601978 irqbalance[783]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 30 03:25:10 np0005601978 irqbalance[783]: IRQ 31 affinity is now unmanaged
Jan 30 03:25:10 np0005601978 irqbalance[783]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 30 03:25:10 np0005601978 irqbalance[783]: IRQ 28 affinity is now unmanaged
Jan 30 03:25:10 np0005601978 irqbalance[783]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 30 03:25:10 np0005601978 irqbalance[783]: IRQ 34 affinity is now unmanaged
Jan 30 03:25:10 np0005601978 irqbalance[783]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 30 03:25:10 np0005601978 irqbalance[783]: IRQ 32 affinity is now unmanaged
Jan 30 03:25:10 np0005601978 irqbalance[783]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 30 03:25:10 np0005601978 irqbalance[783]: IRQ 30 affinity is now unmanaged
Jan 30 03:25:10 np0005601978 irqbalance[783]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 30 03:25:10 np0005601978 irqbalance[783]: IRQ 29 affinity is now unmanaged
Jan 30 03:25:11 np0005601978 dracut[1269]: *** Installing kernel module dependencies done ***
Jan 30 03:25:11 np0005601978 dracut[1269]: *** Resolving executable dependencies ***
Jan 30 03:25:11 np0005601978 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 03:25:12 np0005601978 dracut[1269]: *** Resolving executable dependencies done ***
Jan 30 03:25:13 np0005601978 dracut[1269]: *** Generating early-microcode cpio image ***
Jan 30 03:25:13 np0005601978 dracut[1269]: *** Store current command line parameters ***
Jan 30 03:25:13 np0005601978 dracut[1269]: Stored kernel commandline:
Jan 30 03:25:13 np0005601978 dracut[1269]: No dracut internal kernel commandline stored in the initramfs
Jan 30 03:25:13 np0005601978 dracut[1269]: *** Install squash loader ***
Jan 30 03:25:14 np0005601978 dracut[1269]: *** Squashing the files inside the initramfs ***
Jan 30 03:25:15 np0005601978 dracut[1269]: *** Squashing the files inside the initramfs done ***
Jan 30 03:25:15 np0005601978 dracut[1269]: *** Creating image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' ***
Jan 30 03:25:15 np0005601978 dracut[1269]: *** Hardlinking files ***
Jan 30 03:25:15 np0005601978 dracut[1269]: *** Hardlinking files done ***
Jan 30 03:25:15 np0005601978 dracut[1269]: *** Creating initramfs image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' done ***
Jan 30 03:25:16 np0005601978 kdumpctl[1017]: kdump: kexec: loaded kdump kernel
Jan 30 03:25:16 np0005601978 kdumpctl[1017]: kdump: Starting kdump: [OK]
Jan 30 03:25:16 np0005601978 systemd[1]: Finished Crash recovery kernel arming.
Jan 30 03:25:16 np0005601978 systemd[1]: Startup finished in 1.440s (kernel) + 2.495s (initrd) + 18.061s (userspace) = 21.997s.
Jan 30 03:25:17 np0005601978 systemd[1]: Created slice User Slice of UID 1000.
Jan 30 03:25:17 np0005601978 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 30 03:25:17 np0005601978 systemd-logind[793]: New session 1 of user zuul.
Jan 30 03:25:17 np0005601978 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 30 03:25:17 np0005601978 systemd[1]: Starting User Manager for UID 1000...
Jan 30 03:25:18 np0005601978 systemd[4304]: Queued start job for default target Main User Target.
Jan 30 03:25:18 np0005601978 systemd[4304]: Created slice User Application Slice.
Jan 30 03:25:18 np0005601978 systemd[4304]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 30 03:25:18 np0005601978 systemd[4304]: Started Daily Cleanup of User's Temporary Directories.
Jan 30 03:25:18 np0005601978 systemd[4304]: Reached target Paths.
Jan 30 03:25:18 np0005601978 systemd[4304]: Reached target Timers.
Jan 30 03:25:18 np0005601978 systemd[4304]: Starting D-Bus User Message Bus Socket...
Jan 30 03:25:18 np0005601978 systemd[4304]: Starting Create User's Volatile Files and Directories...
Jan 30 03:25:18 np0005601978 systemd[4304]: Finished Create User's Volatile Files and Directories.
Jan 30 03:25:18 np0005601978 systemd[4304]: Listening on D-Bus User Message Bus Socket.
Jan 30 03:25:18 np0005601978 systemd[4304]: Reached target Sockets.
Jan 30 03:25:18 np0005601978 systemd[4304]: Reached target Basic System.
Jan 30 03:25:18 np0005601978 systemd[4304]: Reached target Main User Target.
Jan 30 03:25:18 np0005601978 systemd[4304]: Startup finished in 112ms.
Jan 30 03:25:18 np0005601978 systemd[1]: Started User Manager for UID 1000.
Jan 30 03:25:18 np0005601978 systemd[1]: Started Session 1 of User zuul.
Jan 30 03:25:18 np0005601978 python3[4386]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:25:22 np0005601978 python3[4414]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:25:29 np0005601978 python3[4472]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:25:30 np0005601978 python3[4512]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 30 03:25:31 np0005601978 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 30 03:25:32 np0005601978 python3[4540]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbK+QNrWBUptwh/tsKUHyiCtL2MOslPM6Ok8BMCQo6bFrbUd3b41fHCXLJAKXUhQCi1jGZcObKr9aNeuB2QNxE8xc5bAMGWovKQ4u31cSz3+yCNPmCHNuVIMM7SCn/3SHL9lx+Mlgvr5y4LrvTxeqs+jgjWMwjgcOCmuiCK3sN+5XaVsPM8J8Q3hJc4oPZcz0m7hnNHwPiUCmUZ8/Fa3AZ2CT2rKka37F4HKzmhoCcGuEDcWJYfniS+jSLHe6v94wfn4yfX8Cni+Tg7PMFKfVoNp4bzlsWo7CJR9gC2d3Deo3PclTA3bdiQrDcq142qOGd9C16Ts0gaa5MtZXkkV8REKdfVwGoP8oDcePun1Bvrh+9fpQI050OnRQesD5MVe8uRKX9Li/HPu9YLN3L05zyP8HFNcli3m9jYS5EIwchiexzHMIP61qOGZEgGeKbRzB9UHRhepfxoBwFNaC77XNrSZoJIaWdzxtdC7LWxgN3HBIIXEk/1tRf6BVUZfU7Usk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:32 np0005601978 python3[4564]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:33 np0005601978 python3[4663]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:33 np0005601978 python3[4734]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769761533.1773481-252-81703562777729/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6b9a517215ca414698aa6ebfe0ccc07e_id_rsa follow=False checksum=e031682c4cdaec7fe3b6bfdb180dc112513e48ea backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:34 np0005601978 python3[4857]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:34 np0005601978 python3[4928]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769761534.064554-307-185106767976121/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6b9a517215ca414698aa6ebfe0ccc07e_id_rsa.pub follow=False checksum=605641a03b6ed0a70ecc2d0ac2aeac1b02553754 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:35 np0005601978 python3[4976]: ansible-ping Invoked with data=pong
Jan 30 03:25:37 np0005601978 python3[5000]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:25:40 np0005601978 python3[5058]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 30 03:25:41 np0005601978 python3[5090]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:41 np0005601978 python3[5114]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:41 np0005601978 python3[5138]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:42 np0005601978 python3[5162]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:42 np0005601978 python3[5186]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:42 np0005601978 python3[5210]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:44 np0005601978 python3[5236]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:45 np0005601978 python3[5314]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:46 np0005601978 python3[5387]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769761545.0431693-33-163255257526519/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:46 np0005601978 python3[5435]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:46 np0005601978 python3[5459]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:47 np0005601978 python3[5483]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:47 np0005601978 python3[5507]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:47 np0005601978 python3[5531]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:48 np0005601978 python3[5555]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:48 np0005601978 python3[5579]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:48 np0005601978 python3[5603]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:48 np0005601978 python3[5627]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:49 np0005601978 python3[5651]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:49 np0005601978 python3[5675]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:49 np0005601978 python3[5699]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:49 np0005601978 python3[5723]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:50 np0005601978 python3[5747]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:50 np0005601978 python3[5771]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:50 np0005601978 python3[5795]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:51 np0005601978 python3[5819]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:51 np0005601978 python3[5843]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:51 np0005601978 python3[5867]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:51 np0005601978 python3[5891]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:52 np0005601978 python3[5915]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:52 np0005601978 python3[5939]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:52 np0005601978 python3[5963]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:53 np0005601978 python3[5987]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:53 np0005601978 python3[6011]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:53 np0005601978 python3[6035]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:25:56 np0005601978 python3[6061]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 30 03:25:56 np0005601978 systemd[1]: Starting Time & Date Service...
Jan 30 03:25:56 np0005601978 systemd[1]: Started Time & Date Service.
Jan 30 03:25:56 np0005601978 systemd-timedated[6063]: Changed time zone to 'UTC' (UTC).
Jan 30 03:25:57 np0005601978 python3[6092]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:57 np0005601978 python3[6168]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:58 np0005601978 python3[6239]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769761557.5422385-252-240605501661092/source _original_basename=tmppd3qrwjs follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:25:58 np0005601978 python3[6339]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:25:59 np0005601978 python3[6410]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769761558.5382187-303-6776137862824/source _original_basename=tmp2krkmtuc follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:00 np0005601978 python3[6512]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:26:00 np0005601978 python3[6585]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769761559.7391772-382-263362533466994/source _original_basename=tmpgopfy7bo follow=False checksum=dbfbf8d503ec9dc3cc74020a84f16a6d71ecb5c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:01 np0005601978 python3[6633]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:26:01 np0005601978 python3[6659]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:26:01 np0005601978 python3[6739]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:26:02 np0005601978 python3[6812]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769761561.5138934-452-105856802438305/source _original_basename=tmpmowh4cgw follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:02 np0005601978 python3[6863]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-ab5a-d783-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:26:03 np0005601978 python3[6891]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-ab5a-d783-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 30 03:26:04 np0005601978 python3[6919]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:22 np0005601978 python3[6945]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:26:27 np0005601978 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 30 03:27:22 np0005601978 systemd-logind[793]: Session 1 logged out. Waiting for processes to exit.
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 30 03:27:25 np0005601978 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 30 03:27:25 np0005601978 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.8924] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 30 03:27:25 np0005601978 systemd-udevd[6948]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9108] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9148] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9155] device (eth1): carrier: link connected
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9159] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9170] policy: auto-activating connection 'Wired connection 1' (0fdac447-07b1-3cb9-a4f0-3c44589ad525)
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9179] device (eth1): Activation: starting connection 'Wired connection 1' (0fdac447-07b1-3cb9-a4f0-3c44589ad525)
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9181] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9189] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9196] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 03:27:25 np0005601978 NetworkManager[856]: <info>  [1769761645.9204] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:25 np0005601978 systemd[4304]: Starting Mark boot as successful...
Jan 30 03:27:25 np0005601978 systemd[4304]: Finished Mark boot as successful.
Jan 30 03:27:26 np0005601978 systemd-logind[793]: New session 3 of user zuul.
Jan 30 03:27:26 np0005601978 systemd[1]: Started Session 3 of User zuul.
Jan 30 03:27:26 np0005601978 python3[6980]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-3199-9bd3-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:27:33 np0005601978 python3[7060]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:27:34 np0005601978 python3[7133]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769761653.4303339-155-194510886250182/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=e62b617a9316137df98c559865bda2bbd1afec9c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:27:34 np0005601978 python3[7183]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 03:27:34 np0005601978 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 30 03:27:34 np0005601978 systemd[1]: Stopped Network Manager Wait Online.
Jan 30 03:27:34 np0005601978 systemd[1]: Stopping Network Manager Wait Online...
Jan 30 03:27:34 np0005601978 systemd[1]: Stopping Network Manager...
Jan 30 03:27:34 np0005601978 NetworkManager[856]: <info>  [1769761654.5920] caught SIGTERM, shutting down normally.
Jan 30 03:27:34 np0005601978 NetworkManager[856]: <info>  [1769761654.5930] dhcp4 (eth0): canceled DHCP transaction
Jan 30 03:27:34 np0005601978 NetworkManager[856]: <info>  [1769761654.5931] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:34 np0005601978 NetworkManager[856]: <info>  [1769761654.5931] dhcp4 (eth0): state changed no lease
Jan 30 03:27:34 np0005601978 NetworkManager[856]: <info>  [1769761654.5933] manager: NetworkManager state is now CONNECTING
Jan 30 03:27:34 np0005601978 NetworkManager[856]: <info>  [1769761654.6085] dhcp4 (eth1): canceled DHCP transaction
Jan 30 03:27:34 np0005601978 NetworkManager[856]: <info>  [1769761654.6085] dhcp4 (eth1): state changed no lease
Jan 30 03:27:34 np0005601978 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 03:27:34 np0005601978 NetworkManager[856]: <info>  [1769761654.6132] exiting (success)
Jan 30 03:27:34 np0005601978 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 03:27:34 np0005601978 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 30 03:27:34 np0005601978 systemd[1]: Stopped Network Manager.
Jan 30 03:27:34 np0005601978 systemd[1]: NetworkManager.service: Consumed 1.314s CPU time, 10.2M memory peak.
Jan 30 03:27:34 np0005601978 systemd[1]: Starting Network Manager...
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.6482] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:85b9e2c8-8235-417d-81be-396eb2d5c232)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.6485] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.6528] manager[0x563b06cc3000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 30 03:27:34 np0005601978 systemd[1]: Starting Hostname Service...
Jan 30 03:27:34 np0005601978 systemd[1]: Started Hostname Service.
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7447] hostname: hostname: using hostnamed
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7448] hostname: static hostname changed from (none) to "np0005601978.novalocal"
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7452] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7457] manager[0x563b06cc3000]: rfkill: Wi-Fi hardware radio set enabled
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7458] manager[0x563b06cc3000]: rfkill: WWAN hardware radio set enabled
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7483] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7483] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7484] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7484] manager: Networking is enabled by state file
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7488] settings: Loaded settings plugin: keyfile (internal)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7491] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7519] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7530] dhcp: init: Using DHCP client 'internal'
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7532] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7538] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7545] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7554] device (lo): Activation: starting connection 'lo' (33b2f94f-fa01-4cf4-b364-8f6cc69c6981)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7563] device (eth0): carrier: link connected
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7566] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7573] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7573] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7582] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7589] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7596] device (eth1): carrier: link connected
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7600] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7610] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (0fdac447-07b1-3cb9-a4f0-3c44589ad525) (indicated)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7610] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7618] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7629] device (eth1): Activation: starting connection 'Wired connection 1' (0fdac447-07b1-3cb9-a4f0-3c44589ad525)
Jan 30 03:27:34 np0005601978 systemd[1]: Started Network Manager.
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7636] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7648] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7652] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7657] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7661] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7667] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7671] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7675] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7681] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7691] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7697] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7707] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7711] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7725] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7731] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7738] device (lo): Activation: successful, device activated.
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7748] dhcp4 (eth0): state changed new lease, address=38.102.83.136
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7756] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7822] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 systemd[1]: Starting Network Manager Wait Online...
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7867] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7869] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7875] manager: NetworkManager state is now CONNECTED_SITE
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7880] device (eth0): Activation: successful, device activated.
Jan 30 03:27:34 np0005601978 NetworkManager[7192]: <info>  [1769761654.7888] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 30 03:27:35 np0005601978 python3[7267]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-3199-9bd3-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:27:44 np0005601978 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 03:28:04 np0005601978 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.4915] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 30 03:28:20 np0005601978 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 03:28:20 np0005601978 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5206] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5210] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5217] device (eth1): Activation: successful, device activated.
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5226] manager: startup complete
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5228] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <warn>  [1769761700.5233] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5244] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 30 03:28:20 np0005601978 systemd[1]: Finished Network Manager Wait Online.
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5365] dhcp4 (eth1): canceled DHCP transaction
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5366] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5366] dhcp4 (eth1): state changed no lease
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5382] policy: auto-activating connection 'ci-private-network' (fcbae144-51cd-56e4-bfd6-6f248d2a14ce)
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5387] device (eth1): Activation: starting connection 'ci-private-network' (fcbae144-51cd-56e4-bfd6-6f248d2a14ce)
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5387] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5390] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5398] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5406] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5474] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5477] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 03:28:20 np0005601978 NetworkManager[7192]: <info>  [1769761700.5482] device (eth1): Activation: successful, device activated.
Jan 30 03:28:30 np0005601978 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 03:28:35 np0005601978 systemd-logind[793]: Session 3 logged out. Waiting for processes to exit.
Jan 30 03:28:35 np0005601978 systemd[1]: session-3.scope: Deactivated successfully.
Jan 30 03:28:35 np0005601978 systemd[1]: session-3.scope: Consumed 1.490s CPU time.
Jan 30 03:28:35 np0005601978 systemd-logind[793]: Removed session 3.
Jan 30 03:28:58 np0005601978 systemd-logind[793]: New session 4 of user zuul.
Jan 30 03:28:58 np0005601978 systemd[1]: Started Session 4 of User zuul.
Jan 30 03:28:59 np0005601978 python3[7376]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:28:59 np0005601978 python3[7449]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769761739.0918555-365-54170518210580/source _original_basename=tmpt36putjv follow=False checksum=5a54f5751b70d25169fc05ac686186710615a542 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:29:02 np0005601978 systemd[1]: session-4.scope: Deactivated successfully.
Jan 30 03:29:02 np0005601978 systemd-logind[793]: Session 4 logged out. Waiting for processes to exit.
Jan 30 03:29:02 np0005601978 systemd-logind[793]: Removed session 4.
Jan 30 03:31:02 np0005601978 systemd[4304]: Created slice User Background Tasks Slice.
Jan 30 03:31:02 np0005601978 systemd[4304]: Starting Cleanup of User's Temporary Files and Directories...
Jan 30 03:31:02 np0005601978 systemd[4304]: Finished Cleanup of User's Temporary Files and Directories.
Jan 30 03:36:32 np0005601978 systemd-logind[793]: New session 5 of user zuul.
Jan 30 03:36:32 np0005601978 systemd[1]: Started Session 5 of User zuul.
Jan 30 03:36:32 np0005601978 python3[7517]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-f7b8-4607-000000002181-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:34 np0005601978 python3[7546]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:34 np0005601978 python3[7572]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:34 np0005601978 python3[7598]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:35 np0005601978 python3[7624]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:35 np0005601978 python3[7650]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:36 np0005601978 python3[7728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:36:36 np0005601978 python3[7801]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769762196.1465273-539-90806408804833/source _original_basename=tmp1t33ov5v follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:36:37 np0005601978 python3[7851]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 03:36:37 np0005601978 systemd[1]: Reloading.
Jan 30 03:36:38 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:36:38 np0005601978 systemd[1]: Starting dnf makecache...
Jan 30 03:36:38 np0005601978 dnf[7883]: Failed determining last makecache time.
Jan 30 03:36:38 np0005601978 dnf[7883]: CentOS Stream 9 - BaseOS                         38 kB/s | 6.1 kB     00:00
Jan 30 03:36:38 np0005601978 dnf[7883]: CentOS Stream 9 - AppStream                      63 kB/s | 6.2 kB     00:00
Jan 30 03:36:39 np0005601978 python3[7915]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 30 03:36:39 np0005601978 dnf[7883]: CentOS Stream 9 - CRB                            53 kB/s | 6.0 kB     00:00
Jan 30 03:36:39 np0005601978 dnf[7883]: CentOS Stream 9 - Extras packages                30 kB/s | 7.3 kB     00:00
Jan 30 03:36:39 np0005601978 python3[7943]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:40 np0005601978 dnf[7883]: Metadata cache created.
Jan 30 03:36:40 np0005601978 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 30 03:36:40 np0005601978 systemd[1]: Finished dnf makecache.
Jan 30 03:36:40 np0005601978 python3[7971]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:40 np0005601978 python3[7999]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:40 np0005601978 python3[8027]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:41 np0005601978 python3[8054]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-f7b8-4607-000000002188-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:36:42 np0005601978 python3[8084]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 30 03:36:44 np0005601978 systemd[1]: session-5.scope: Deactivated successfully.
Jan 30 03:36:44 np0005601978 systemd[1]: session-5.scope: Consumed 4.022s CPU time.
Jan 30 03:36:44 np0005601978 systemd-logind[793]: Session 5 logged out. Waiting for processes to exit.
Jan 30 03:36:44 np0005601978 systemd-logind[793]: Removed session 5.
Jan 30 03:36:46 np0005601978 systemd-logind[793]: New session 6 of user zuul.
Jan 30 03:36:46 np0005601978 systemd[1]: Started Session 6 of User zuul.
Jan 30 03:36:46 np0005601978 python3[8119]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 30 03:36:55 np0005601978 setsebool[8155]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 30 03:36:55 np0005601978 setsebool[8155]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 30 03:37:08 np0005601978 kernel: SELinux:  Converting 385 SID table entries...
Jan 30 03:37:08 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 03:37:08 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 03:37:08 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 03:37:08 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 03:37:08 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 03:37:08 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 03:37:08 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 03:37:18 np0005601978 kernel: SELinux:  Converting 388 SID table entries...
Jan 30 03:37:18 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 03:37:18 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 03:37:18 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 03:37:18 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 03:37:18 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 03:37:18 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 03:37:18 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 03:37:37 np0005601978 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 30 03:37:37 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 03:37:37 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 03:37:37 np0005601978 systemd[1]: Reloading.
Jan 30 03:37:37 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:37:37 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 03:37:39 np0005601978 python3[10207]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-08d8-eacc-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:37:40 np0005601978 kernel: evm: overlay not supported
Jan 30 03:37:41 np0005601978 systemd[4304]: Starting D-Bus User Message Bus...
Jan 30 03:37:41 np0005601978 dbus-broker-launch[11665]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 30 03:37:41 np0005601978 dbus-broker-launch[11665]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 30 03:37:41 np0005601978 systemd[4304]: Started D-Bus User Message Bus.
Jan 30 03:37:41 np0005601978 dbus-broker-lau[11665]: Ready
Jan 30 03:37:41 np0005601978 systemd[4304]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 30 03:37:41 np0005601978 systemd[4304]: Created slice Slice /user.
Jan 30 03:37:41 np0005601978 systemd[4304]: podman-11168.scope: unit configures an IP firewall, but not running as root.
Jan 30 03:37:41 np0005601978 systemd[4304]: (This warning is only shown for the first unit using IP firewalling.)
Jan 30 03:37:41 np0005601978 systemd[4304]: Started podman-11168.scope.
Jan 30 03:37:41 np0005601978 systemd[4304]: Started podman-pause-d3a94289.scope.
Jan 30 03:37:42 np0005601978 python3[12533]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.119:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.119:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:37:42 np0005601978 python3[12533]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 30 03:37:43 np0005601978 systemd-logind[793]: Session 6 logged out. Waiting for processes to exit.
Jan 30 03:37:43 np0005601978 systemd[1]: session-6.scope: Deactivated successfully.
Jan 30 03:37:43 np0005601978 systemd[1]: session-6.scope: Consumed 44.261s CPU time.
Jan 30 03:37:43 np0005601978 systemd-logind[793]: Removed session 6.
Jan 30 03:38:08 np0005601978 systemd-logind[793]: New session 7 of user zuul.
Jan 30 03:38:08 np0005601978 systemd[1]: Started Session 7 of User zuul.
Jan 30 03:38:08 np0005601978 python3[23101]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF/Y7EpNCX4+jeTf6M3Q1EA4MMTh+cMR1J5eXc3jzvTTUqxCNTO2CaC+w+dR8pLzeBhqJOAlAXh7xbdLGTmEiGA= zuul@np0005601976.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:38:09 np0005601978 python3[23311]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF/Y7EpNCX4+jeTf6M3Q1EA4MMTh+cMR1J5eXc3jzvTTUqxCNTO2CaC+w+dR8pLzeBhqJOAlAXh7xbdLGTmEiGA= zuul@np0005601976.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:38:09 np0005601978 python3[23677]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005601978.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 30 03:38:10 np0005601978 python3[24340]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF/Y7EpNCX4+jeTf6M3Q1EA4MMTh+cMR1J5eXc3jzvTTUqxCNTO2CaC+w+dR8pLzeBhqJOAlAXh7xbdLGTmEiGA= zuul@np0005601976.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 30 03:38:11 np0005601978 python3[24737]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:38:11 np0005601978 python3[25107]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769762291.0871964-169-6000410475094/source _original_basename=tmphuo8y6an follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:38:12 np0005601978 python3[25524]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 30 03:38:12 np0005601978 systemd[1]: Starting Hostname Service...
Jan 30 03:38:12 np0005601978 systemd[1]: Started Hostname Service.
Jan 30 03:38:12 np0005601978 systemd-hostnamed[25611]: Changed pretty hostname to 'compute-1'
Jan 30 03:38:12 np0005601978 systemd-hostnamed[25611]: Hostname set to <compute-1> (static)
Jan 30 03:38:12 np0005601978 NetworkManager[7192]: <info>  [1769762292.8512] hostname: static hostname changed from "np0005601978.novalocal" to "compute-1"
Jan 30 03:38:12 np0005601978 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 03:38:12 np0005601978 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 03:38:13 np0005601978 systemd[1]: session-7.scope: Deactivated successfully.
Jan 30 03:38:13 np0005601978 systemd[1]: session-7.scope: Consumed 1.933s CPU time.
Jan 30 03:38:13 np0005601978 systemd-logind[793]: Session 7 logged out. Waiting for processes to exit.
Jan 30 03:38:13 np0005601978 systemd-logind[793]: Removed session 7.
Jan 30 03:38:22 np0005601978 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 03:38:25 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 03:38:25 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 03:38:25 np0005601978 systemd[1]: man-db-cache-update.service: Consumed 43.479s CPU time.
Jan 30 03:38:25 np0005601978 systemd[1]: run-rd6e1a06145f44b299e9b70326dac2915.service: Deactivated successfully.
Jan 30 03:38:42 np0005601978 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 30 03:40:02 np0005601978 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 30 03:40:02 np0005601978 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 30 03:40:02 np0005601978 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 30 03:40:02 np0005601978 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 30 03:42:17 np0005601978 systemd-logind[793]: New session 8 of user zuul.
Jan 30 03:42:17 np0005601978 systemd[1]: Started Session 8 of User zuul.
Jan 30 03:42:17 np0005601978 python3[30083]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:42:19 np0005601978 python3[30199]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:19 np0005601978 python3[30272]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.140314-34126-265865294951521/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:20 np0005601978 python3[30298]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:20 np0005601978 python3[30371]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.140314-34126-265865294951521/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:20 np0005601978 python3[30397]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:21 np0005601978 python3[30470]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.140314-34126-265865294951521/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:21 np0005601978 python3[30496]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:21 np0005601978 python3[30569]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.140314-34126-265865294951521/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:21 np0005601978 python3[30595]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:22 np0005601978 python3[30668]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.140314-34126-265865294951521/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:22 np0005601978 python3[30694]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:22 np0005601978 python3[30767]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.140314-34126-265865294951521/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:22 np0005601978 python3[30793]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 30 03:42:22 np0005601978 python3[30866]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769762539.140314-34126-265865294951521/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:42:33 np0005601978 python3[30914]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:47:33 np0005601978 systemd[1]: session-8.scope: Deactivated successfully.
Jan 30 03:47:33 np0005601978 systemd[1]: session-8.scope: Consumed 4.244s CPU time.
Jan 30 03:47:33 np0005601978 systemd-logind[793]: Session 8 logged out. Waiting for processes to exit.
Jan 30 03:47:33 np0005601978 systemd-logind[793]: Removed session 8.
Jan 30 03:58:09 np0005601978 systemd-logind[793]: New session 9 of user zuul.
Jan 30 03:58:09 np0005601978 systemd[1]: Started Session 9 of User zuul.
Jan 30 03:58:10 np0005601978 python3.9[31076]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:11 np0005601978 python3.9[31257]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:58:19 np0005601978 systemd[1]: session-9.scope: Deactivated successfully.
Jan 30 03:58:19 np0005601978 systemd[1]: session-9.scope: Consumed 7.256s CPU time.
Jan 30 03:58:19 np0005601978 systemd-logind[793]: Session 9 logged out. Waiting for processes to exit.
Jan 30 03:58:19 np0005601978 systemd-logind[793]: Removed session 9.
Jan 30 03:58:24 np0005601978 systemd-logind[793]: New session 10 of user zuul.
Jan 30 03:58:25 np0005601978 systemd[1]: Started Session 10 of User zuul.
Jan 30 03:58:25 np0005601978 python3.9[31468]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:26 np0005601978 systemd[1]: session-10.scope: Deactivated successfully.
Jan 30 03:58:26 np0005601978 systemd-logind[793]: Session 10 logged out. Waiting for processes to exit.
Jan 30 03:58:26 np0005601978 systemd-logind[793]: Removed session 10.
Jan 30 03:58:42 np0005601978 systemd-logind[793]: New session 11 of user zuul.
Jan 30 03:58:42 np0005601978 systemd[1]: Started Session 11 of User zuul.
Jan 30 03:58:42 np0005601978 python3.9[31650]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 30 03:58:43 np0005601978 python3.9[31824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:44 np0005601978 python3.9[31976]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 03:58:45 np0005601978 python3.9[32129]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 03:58:46 np0005601978 python3.9[32281]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:58:47 np0005601978 python3.9[32433]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 03:58:47 np0005601978 python3.9[32556]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763526.7475924-173-125412334531920/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:58:48 np0005601978 python3.9[32708]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:49 np0005601978 python3.9[32864]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 03:58:50 np0005601978 python3.9[33016]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 03:58:50 np0005601978 python3.9[33166]: ansible-ansible.builtin.service_facts Invoked
Jan 30 03:58:53 np0005601978 python3.9[33419]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 03:58:54 np0005601978 python3.9[33569]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:55 np0005601978 python3.9[33723]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 03:58:56 np0005601978 python3.9[33881]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 03:58:57 np0005601978 python3.9[33965]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 03:59:42 np0005601978 systemd[1]: Reloading.
Jan 30 03:59:42 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:59:42 np0005601978 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 30 03:59:42 np0005601978 systemd[1]: Reloading.
Jan 30 03:59:42 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:59:42 np0005601978 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 30 03:59:42 np0005601978 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 30 03:59:42 np0005601978 systemd[1]: Reloading.
Jan 30 03:59:42 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 03:59:43 np0005601978 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 30 03:59:43 np0005601978 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Jan 30 03:59:43 np0005601978 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Jan 30 03:59:43 np0005601978 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Jan 30 04:00:43 np0005601978 kernel: SELinux:  Converting 2726 SID table entries...
Jan 30 04:00:43 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:00:43 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:00:43 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:00:43 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:00:43 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:00:43 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:00:43 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:00:43 np0005601978 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 30 04:00:43 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:00:44 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:00:44 np0005601978 systemd[1]: Reloading.
Jan 30 04:00:44 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:00:44 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:00:45 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:00:45 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:00:45 np0005601978 systemd[1]: man-db-cache-update.service: Consumed 1.108s CPU time.
Jan 30 04:00:45 np0005601978 systemd[1]: run-r29c6d3a315f84ff4bef145ab43f7cc80.service: Deactivated successfully.
Jan 30 04:00:45 np0005601978 python3.9[35478]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:00:47 np0005601978 python3.9[35759]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 30 04:00:48 np0005601978 python3.9[35911]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 30 04:00:51 np0005601978 python3.9[36064]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:00:53 np0005601978 python3.9[36216]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 30 04:00:55 np0005601978 python3.9[36368]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:00 np0005601978 python3.9[36522]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:01 np0005601978 python3.9[36645]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763660.3270152-662-40866107403969/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:01:02 np0005601978 python3.9[36812]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:01:03 np0005601978 python3.9[36964]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:03 np0005601978 python3.9[37117]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:01:05 np0005601978 python3.9[37269]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 30 04:01:05 np0005601978 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:01:05 np0005601978 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:01:05 np0005601978 python3.9[37423]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:01:06 np0005601978 python3.9[37581]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 30 04:01:07 np0005601978 python3.9[37741]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 30 04:01:08 np0005601978 python3.9[37894]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:01:09 np0005601978 python3.9[38052]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 30 04:01:10 np0005601978 python3.9[38204]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:01:12 np0005601978 python3.9[38357]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:13 np0005601978 python3.9[38509]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:13 np0005601978 python3.9[38632]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763673.0135796-1019-38536116867865/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:15 np0005601978 python3.9[38784]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:01:15 np0005601978 systemd[1]: Starting Load Kernel Modules...
Jan 30 04:01:15 np0005601978 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 30 04:01:15 np0005601978 kernel: Bridge firewalling registered
Jan 30 04:01:15 np0005601978 systemd-modules-load[38788]: Inserted module 'br_netfilter'
Jan 30 04:01:15 np0005601978 systemd[1]: Finished Load Kernel Modules.
Jan 30 04:01:16 np0005601978 python3.9[38944]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:16 np0005601978 python3.9[39067]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763675.6042688-1088-48516700402283/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:17 np0005601978 python3.9[39219]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:01:21 np0005601978 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Jan 30 04:01:21 np0005601978 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Jan 30 04:01:21 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:01:21 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:01:21 np0005601978 systemd[1]: Reloading.
Jan 30 04:01:21 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:01:21 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:01:24 np0005601978 python3.9[42665]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:01:25 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:01:25 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:01:25 np0005601978 systemd[1]: man-db-cache-update.service: Consumed 4.166s CPU time.
Jan 30 04:01:25 np0005601978 systemd[1]: run-rd7d533be940f4def874f18a1b6168c0c.service: Deactivated successfully.
Jan 30 04:01:25 np0005601978 python3.9[43152]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 30 04:01:26 np0005601978 python3.9[43302]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:01:26 np0005601978 python3.9[43454]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:27 np0005601978 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 30 04:01:27 np0005601978 systemd[1]: Starting Authorization Manager...
Jan 30 04:01:27 np0005601978 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 30 04:01:27 np0005601978 polkitd[43671]: Started polkitd version 0.117
Jan 30 04:01:27 np0005601978 systemd[1]: Started Authorization Manager.
Jan 30 04:01:28 np0005601978 python3.9[43841]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:01:28 np0005601978 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 30 04:01:28 np0005601978 systemd[1]: tuned.service: Deactivated successfully.
Jan 30 04:01:28 np0005601978 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 30 04:01:28 np0005601978 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 30 04:01:28 np0005601978 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 30 04:01:29 np0005601978 python3.9[44003]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 30 04:01:33 np0005601978 python3.9[44155]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:01:33 np0005601978 systemd[1]: Reloading.
Jan 30 04:01:33 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:01:34 np0005601978 python3.9[44344]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:01:34 np0005601978 systemd[1]: Reloading.
Jan 30 04:01:34 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:01:35 np0005601978 python3.9[44532]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:36 np0005601978 python3.9[44685]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:36 np0005601978 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 30 04:01:36 np0005601978 python3.9[44838]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:39 np0005601978 python3.9[45000]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:39 np0005601978 python3.9[45153]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:01:39 np0005601978 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 30 04:01:39 np0005601978 systemd[1]: Stopped Apply Kernel Variables.
Jan 30 04:01:39 np0005601978 systemd[1]: Stopping Apply Kernel Variables...
Jan 30 04:01:39 np0005601978 systemd[1]: Starting Apply Kernel Variables...
Jan 30 04:01:39 np0005601978 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 30 04:01:39 np0005601978 systemd[1]: Finished Apply Kernel Variables.
Jan 30 04:01:40 np0005601978 systemd[1]: session-11.scope: Deactivated successfully.
Jan 30 04:01:40 np0005601978 systemd[1]: session-11.scope: Consumed 2min 8.556s CPU time.
Jan 30 04:01:40 np0005601978 systemd-logind[793]: Session 11 logged out. Waiting for processes to exit.
Jan 30 04:01:40 np0005601978 systemd-logind[793]: Removed session 11.
Jan 30 04:01:45 np0005601978 systemd-logind[793]: New session 12 of user zuul.
Jan 30 04:01:45 np0005601978 systemd[1]: Started Session 12 of User zuul.
Jan 30 04:01:46 np0005601978 python3.9[45336]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:01:48 np0005601978 python3.9[45490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:01:49 np0005601978 python3.9[45646]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:50 np0005601978 python3.9[45797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:01:51 np0005601978 python3.9[45953]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:01:52 np0005601978 python3.9[46037]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:01:54 np0005601978 python3.9[46190]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:01:55 np0005601978 python3.9[46361]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:01:55 np0005601978 python3.9[46513]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:01:56 np0005601978 systemd[1]: var-lib-containers-storage-overlay-compat1405510760-merged.mount: Deactivated successfully.
Jan 30 04:01:56 np0005601978 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck821639356-merged.mount: Deactivated successfully.
Jan 30 04:01:56 np0005601978 podman[46514]: 2026-01-30 09:01:56.240606868 +0000 UTC m=+0.327850115 system refresh
Jan 30 04:01:57 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:01:57 np0005601978 python3.9[46676]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:57 np0005601978 python3.9[46799]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763716.491931-283-276682769179479/.source.json follow=False _original_basename=podman_network_config.j2 checksum=913b56fbc64fbcc719042fbf115600f429b3c476 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:01:58 np0005601978 python3.9[46951]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:01:58 np0005601978 python3.9[47074]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763717.885557-328-244099261720171/.source.conf follow=False _original_basename=registries.conf.j2 checksum=4891ae8372aa80a8fa92515759173ef122bd9c5c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:01:59 np0005601978 python3.9[47226]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:02:00 np0005601978 python3.9[47378]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:02:00 np0005601978 python3.9[47530]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:02:01 np0005601978 python3.9[47682]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:02:02 np0005601978 python3.9[47832]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:02:03 np0005601978 python3.9[47986]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:05 np0005601978 python3.9[48139]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:07 np0005601978 python3.9[48300]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:10 np0005601978 python3.9[48453]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:12 np0005601978 python3.9[48606]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:15 np0005601978 python3.9[48762]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:18 np0005601978 python3.9[48932]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:20 np0005601978 python3.9[49085]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:32 np0005601978 python3.9[49420]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:02:34 np0005601978 python3.9[49576]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
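The ansible-ansible.legacy.dnf calls above all run with download_only=True, so this phase only prefetches the package set into the dnf cache; nothing is installed yet. A minimal ad-hoc sketch of the same prefetch, assuming the repositories already configured on this node (package names taken from the first logged invocation):
    # prefetch only, equivalent to download_only=True in the tasks above
    dnf install -y --downloadonly \
        driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux \
        python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc \
        ksmtuned systemd-container crypto-policies-scripts grubby sos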
Jan 30 04:02:37 np0005601978 python3.9[49733]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:02:37 np0005601978 python3.9[49908]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:02:38 np0005601978 python3.9[50031]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769763757.516158-802-187035506829368/.source.json _original_basename=.mn35i3j3 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:02:39 np0005601978 python3.9[50183]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:02:39 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:42 np0005601978 systemd[1]: var-lib-containers-storage-overlay-compat1710184471-lower\x2dmapped.mount: Deactivated successfully.
Jan 30 04:02:45 np0005601978 podman[50196]: 2026-01-30 09:02:45.866518477 +0000 UTC m=+6.330990402 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 30 04:02:45 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:45 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:45 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:46 np0005601978 python3.9[50492]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:02:59 np0005601978 podman[50504]: 2026-01-30 09:02:59.842606804 +0000 UTC m=+12.972634367 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:02:59 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:59 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:02:59 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:00 np0005601978 python3.9[50805]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:03:13 np0005601978 podman[50818]: 2026-01-30 09:03:13.854873731 +0000 UTC m=+13.065805556 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 30 04:03:13 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:13 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:13 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:15 np0005601978 python3.9[51097]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:03:15 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:21 np0005601978 podman[51109]: 2026-01-30 09:03:21.030161088 +0000 UTC m=+5.455089046 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 30 04:03:21 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:21 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:21 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:21 np0005601978 python3.9[51365]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 30 04:03:23 np0005601978 podman[51377]: 2026-01-30 09:03:23.304236108 +0000 UTC m=+1.583023202 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 30 04:03:23 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:23 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:03:23 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
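The containers.podman.podman_image tasks above pull the service images using the auth file written to /root/.config/containers/auth.json; the var-lib-containers-storage-overlay.mount deactivations after each pull are routine overlay unmounts, not failures. A hedged manual equivalent for one of the logged images, plus a read-only check of local storage:
    # same pull the podman_image module performed for the ovn-controller image
    podman pull --authfile /root/.config/containers/auth.json \
        quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
    # list what landed in local storage
    podman images --format '{{.Repository}}:{{.Tag}}'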
Jan 30 04:03:24 np0005601978 systemd[1]: session-12.scope: Deactivated successfully.
Jan 30 04:03:24 np0005601978 systemd[1]: session-12.scope: Consumed 1min 39.439s CPU time.
Jan 30 04:03:24 np0005601978 systemd-logind[793]: Session 12 logged out. Waiting for processes to exit.
Jan 30 04:03:24 np0005601978 systemd-logind[793]: Removed session 12.
Jan 30 04:03:30 np0005601978 systemd-logind[793]: New session 13 of user zuul.
Jan 30 04:03:30 np0005601978 systemd[1]: Started Session 13 of User zuul.
Jan 30 04:03:31 np0005601978 python3.9[51677]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:03:32 np0005601978 python3.9[51833]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 30 04:03:32 np0005601978 python3.9[51986]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:03:33 np0005601978 python3.9[52144]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
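The group and user tasks above pin the openvswitch account to UID/GID 42476 with hugetlbfs as a supplementary group. Roughly the same result from the shell, using only the values shown in the logged parameters (sketch, not part of the playbook):
    groupadd --gid 42476 openvswitch
    useradd --uid 42476 --gid openvswitch --groups hugetlbfs \
        --shell /sbin/nologin --comment "openvswitch user" openvswitch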
Jan 30 04:03:34 np0005601978 python3.9[52304]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:03:35 np0005601978 python3.9[52388]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:03:38 np0005601978 python3.9[52549]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:03:52 np0005601978 kernel: SELinux:  Converting 2739 SID table entries...
Jan 30 04:03:52 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:03:52 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:03:52 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:03:52 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:03:52 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:03:52 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:03:52 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:03:52 np0005601978 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 30 04:03:52 np0005601978 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 30 04:03:53 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:03:53 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:03:53 np0005601978 systemd[1]: Reloading.
Jan 30 04:03:53 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:03:53 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:03:53 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:03:53 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:03:53 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:03:53 np0005601978 systemd[1]: run-r5fd9d73d5317480c93d484e0ec6adf5c.service: Deactivated successfully.
Jan 30 04:03:55 np0005601978 python3.9[53647]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:03:55 np0005601978 systemd[1]: Reloading.
Jan 30 04:03:55 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:03:55 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:03:55 np0005601978 systemd[1]: Starting Open vSwitch Database Unit...
Jan 30 04:03:55 np0005601978 chown[53689]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 30 04:03:55 np0005601978 ovs-ctl[53694]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 30 04:03:55 np0005601978 ovs-ctl[53694]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 30 04:03:55 np0005601978 ovs-ctl[53694]: Starting ovsdb-server [  OK  ]
Jan 30 04:03:55 np0005601978 ovs-vsctl[53743]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 30 04:03:55 np0005601978 ovs-vsctl[53763]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"9803b804-d88a-4443-b777-6ecddbb75ed8\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 30 04:03:55 np0005601978 ovs-ctl[53694]: Configuring Open vSwitch system IDs [  OK  ]
Jan 30 04:03:55 np0005601978 ovs-ctl[53694]: Enabling remote OVSDB managers [  OK  ]
Jan 30 04:03:55 np0005601978 ovs-vsctl[53769]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 30 04:03:55 np0005601978 systemd[1]: Started Open vSwitch Database Unit.
Jan 30 04:03:55 np0005601978 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 30 04:03:55 np0005601978 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 30 04:03:55 np0005601978 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 30 04:03:55 np0005601978 kernel: openvswitch: Open vSwitch switching datapath
Jan 30 04:03:55 np0005601978 ovs-ctl[53814]: Inserting openvswitch module [  OK  ]
Jan 30 04:03:55 np0005601978 ovs-ctl[53783]: Starting ovs-vswitchd [  OK  ]
Jan 30 04:03:55 np0005601978 ovs-vsctl[53832]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 30 04:03:55 np0005601978 ovs-ctl[53783]: Enabling remote OVSDB managers [  OK  ]
Jan 30 04:03:55 np0005601978 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 30 04:03:55 np0005601978 systemd[1]: Starting Open vSwitch...
Jan 30 04:03:55 np0005601978 systemd[1]: Finished Open vSwitch.
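At this point ovsdb-server and ovs-vswitchd are running and the kernel openvswitch module is loaded. A few read-only checks that match what the units above report (illustrative only; unit names correspond to the "Database Unit", "Forwarding Unit" and "Open vSwitch" descriptions in the log):
    systemctl --no-pager status openvswitch.service ovsdb-server.service ovs-vswitchd.service
    ovs-vsctl show
    ovs-vsctl get Open_vSwitch . ovs-version external-ids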
Jan 30 04:03:56 np0005601978 python3.9[53983]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:03:57 np0005601978 python3.9[54135]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
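The sefcontext task above adds a persistent file-context rule for /var/lib/edpm-config, which is what triggers the SELinux policy reload logged just below. The conventional CLI equivalent is sketched here; restorecon only takes effect once the directory exists (it is created a few tasks later), and the selevel shown in the module call is the default s0:
    semanage fcontext -a -t container_file_t '/var/lib/edpm-config(/.*)?'
    restorecon -Rv /var/lib/edpm-config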
Jan 30 04:03:59 np0005601978 kernel: SELinux:  Converting 2753 SID table entries...
Jan 30 04:03:59 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:03:59 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:03:59 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:03:59 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:03:59 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:03:59 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:03:59 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:04:00 np0005601978 python3.9[54290]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:04:01 np0005601978 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 30 04:04:01 np0005601978 python3.9[54448]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:04:03 np0005601978 python3.9[54601]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:04:05 np0005601978 python3.9[54888]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 30 04:04:06 np0005601978 python3.9[55038]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:04:06 np0005601978 python3.9[55192]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:04:08 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:04:08 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:04:08 np0005601978 systemd[1]: Reloading.
Jan 30 04:04:08 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:04:08 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:04:08 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:04:08 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:04:08 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:04:08 np0005601978 systemd[1]: run-rd7650f6be128454694126854a985105c.service: Deactivated successfully.
Jan 30 04:04:10 np0005601978 python3.9[55510]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:04:10 np0005601978 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 30 04:04:10 np0005601978 systemd[1]: Stopped Network Manager Wait Online.
Jan 30 04:04:10 np0005601978 systemd[1]: Stopping Network Manager Wait Online...
Jan 30 04:04:10 np0005601978 systemd[1]: Stopping Network Manager...
Jan 30 04:04:10 np0005601978 NetworkManager[7192]: <info>  [1769763850.2461] caught SIGTERM, shutting down normally.
Jan 30 04:04:10 np0005601978 NetworkManager[7192]: <info>  [1769763850.2479] dhcp4 (eth0): canceled DHCP transaction
Jan 30 04:04:10 np0005601978 NetworkManager[7192]: <info>  [1769763850.2479] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 04:04:10 np0005601978 NetworkManager[7192]: <info>  [1769763850.2479] dhcp4 (eth0): state changed no lease
Jan 30 04:04:10 np0005601978 NetworkManager[7192]: <info>  [1769763850.2485] manager: NetworkManager state is now CONNECTED_SITE
Jan 30 04:04:10 np0005601978 NetworkManager[7192]: <info>  [1769763850.2560] exiting (success)
Jan 30 04:04:10 np0005601978 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 04:04:10 np0005601978 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 04:04:10 np0005601978 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 30 04:04:10 np0005601978 systemd[1]: Stopped Network Manager.
Jan 30 04:04:10 np0005601978 systemd[1]: NetworkManager.service: Consumed 14.506s CPU time, 4.1M memory peak, read 0B from disk, written 21.0K to disk.
Jan 30 04:04:10 np0005601978 systemd[1]: Starting Network Manager...
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.3522] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:85b9e2c8-8235-417d-81be-396eb2d5c232)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.3525] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.3583] manager[0x560d732ce000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 30 04:04:10 np0005601978 systemd[1]: Starting Hostname Service...
Jan 30 04:04:10 np0005601978 systemd[1]: Started Hostname Service.
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4357] hostname: hostname: using hostnamed
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4359] hostname: static hostname changed from (none) to "compute-1"
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4365] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4370] manager[0x560d732ce000]: rfkill: Wi-Fi hardware radio set enabled
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4370] manager[0x560d732ce000]: rfkill: WWAN hardware radio set enabled
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4394] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4405] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4406] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4407] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4408] manager: Networking is enabled by state file
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4410] settings: Loaded settings plugin: keyfile (internal)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4415] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4449] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4459] dhcp: init: Using DHCP client 'internal'
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4462] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4468] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4475] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4483] device (lo): Activation: starting connection 'lo' (33b2f94f-fa01-4cf4-b364-8f6cc69c6981)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4490] device (eth0): carrier: link connected
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4495] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4500] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4501] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4508] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4518] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4526] device (eth1): carrier: link connected
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4530] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4536] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (fcbae144-51cd-56e4-bfd6-6f248d2a14ce) (indicated)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4536] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4541] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4548] device (eth1): Activation: starting connection 'ci-private-network' (fcbae144-51cd-56e4-bfd6-6f248d2a14ce)
Jan 30 04:04:10 np0005601978 systemd[1]: Started Network Manager.
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4555] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4562] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4565] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4566] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4568] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4571] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4574] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4587] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4591] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4597] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4600] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4606] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4615] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4627] dhcp4 (eth0): state changed new lease, address=38.102.83.136
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4630] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4635] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4702] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 30 04:04:10 np0005601978 systemd[1]: Starting Network Manager Wait Online...
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4709] device (lo): Activation: successful, device activated.
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4717] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4725] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4727] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4731] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4734] device (eth1): Activation: successful, device activated.
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4744] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4746] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4748] manager: NetworkManager state is now CONNECTED_SITE
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4750] device (eth0): Activation: successful, device activated.
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4755] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 30 04:04:10 np0005601978 NetworkManager[55525]: <info>  [1769763850.4758] manager: startup complete
Jan 30 04:04:10 np0005601978 systemd[1]: Finished Network Manager Wait Online.
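NetworkManager has now been restarted (new PID 55525, with the NMOvsFactory plugin loaded for the OVS work that follows) and reports startup complete with eth0 and eth1 reassumed. Illustrative read-only verification, not part of the logged run:
    nmcli general status
    nmcli -f DEVICE,STATE,CONNECTION device status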
Jan 30 04:04:11 np0005601978 python3.9[55737]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:04:15 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:04:15 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:04:15 np0005601978 systemd[1]: Reloading.
Jan 30 04:04:15 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:04:15 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:04:16 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:04:16 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:04:16 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:04:16 np0005601978 systemd[1]: run-r787f06ccec6e4e26a71551684d55a009.service: Deactivated successfully.
Jan 30 04:04:20 np0005601978 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 04:04:20 np0005601978 python3.9[56196]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:04:21 np0005601978 python3.9[56348]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:22 np0005601978 python3.9[56502]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:23 np0005601978 python3.9[56654]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:23 np0005601978 python3.9[56806]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:24 np0005601978 python3.9[56958]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
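The four ini_file tasks above ensure that [main] in NetworkManager.conf carries no-auto-default=* and that any dns=none or rc-manager=unmanaged overrides are removed from both NetworkManager.conf and conf.d/99-cloud-init.conf. Since crudini was installed earlier in this run, the resulting values can be read back as a sketch (a key that was removed simply makes crudini exit nonzero):
    crudini --get /etc/NetworkManager/NetworkManager.conf main no-auto-default
    crudini --get /etc/NetworkManager/NetworkManager.conf main dns || echo "dns override removed"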
Jan 30 04:04:25 np0005601978 python3.9[57110]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:04:25 np0005601978 python3.9[57233]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763864.6556234-643-139706043608951/.source _original_basename=.gvmshroe follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:26 np0005601978 python3.9[57385]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:27 np0005601978 python3.9[57537]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 30 04:04:27 np0005601978 python3.9[57689]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:30 np0005601978 python3.9[58116]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 30 04:04:31 np0005601978 ansible-async_wrapper.py[58291]: Invoked with j896792223725 300 /home/zuul/.ansible/tmp/ansible-tmp-1769763870.3361647-841-209176051537471/AnsiballZ_edpm_os_net_config.py _
Jan 30 04:04:31 np0005601978 ansible-async_wrapper.py[58294]: Starting module and watcher
Jan 30 04:04:31 np0005601978 ansible-async_wrapper.py[58294]: Start watching 58295 (300)
Jan 30 04:04:31 np0005601978 ansible-async_wrapper.py[58295]: Start module (58295)
Jan 30 04:04:31 np0005601978 ansible-async_wrapper.py[58291]: Return async_wrapper task started.
Jan 30 04:04:31 np0005601978 python3.9[58296]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
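ansible-edpm_os_net_config runs asynchronously (see the async_wrapper lines above) and drives os-net-config against /etc/os-net-config/config.yaml with cleanup, debug and detailed exit codes enabled. An approximate foreground equivalent, assuming the stock os-net-config CLI flags; selection of the nmstate backend (use_nmstate=True) is handled inside the module rather than shown here:
    os-net-config --config-file /etc/os-net-config/config.yaml \
        --debug --detailed-exit-codes --cleanup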
Jan 30 04:04:32 np0005601978 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 30 04:04:32 np0005601978 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 30 04:04:32 np0005601978 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 30 04:04:32 np0005601978 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 30 04:04:32 np0005601978 kernel: cfg80211: failed to load regulatory.db
Jan 30 04:04:32 np0005601978 NetworkManager[55525]: <info>  [1769763872.9682] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58297 uid=0 result="success"
Jan 30 04:04:32 np0005601978 NetworkManager[55525]: <info>  [1769763872.9703] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0141] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0142] audit: op="connection-add" uuid="2d7e8ef8-4dc4-445e-b174-26f74c5086ad" name="br-ex-br" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0162] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0163] audit: op="connection-add" uuid="06f1c339-6b2b-426c-814f-c828cc208d8b" name="br-ex-port" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0177] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0179] audit: op="connection-add" uuid="70ce8507-d301-4655-9681-eab017a362b1" name="eth1-port" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0191] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0192] audit: op="connection-add" uuid="cd03775e-e9e7-4b88-8d70-561e70b45053" name="vlan20-port" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0206] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0208] audit: op="connection-add" uuid="8b5eaba7-2696-47bb-a041-ecc34eed5156" name="vlan21-port" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0221] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0222] audit: op="connection-add" uuid="ee449963-ce5b-4d48-be71-a3a0c3ff501d" name="vlan22-port" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0241] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0257] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0258] audit: op="connection-add" uuid="c2fa3927-c00c-480d-bb24-b02f85dd3feb" name="br-ex-if" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0314] audit: op="connection-update" uuid="fcbae144-51cd-56e4-bfd6-6f248d2a14ce" name="ci-private-network" args="connection.port-type,connection.timestamp,connection.slave-type,connection.controller,connection.master,ovs-external-ids.data,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.addresses,ipv4.never-default,ipv4.routes,ipv6.routing-rules,ipv6.addr-gen-mode,ipv6.dns,ipv6.method,ipv6.addresses,ipv6.routes,ovs-interface.type" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0329] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0330] audit: op="connection-add" uuid="cb399f0d-9df4-4dda-84a5-500d2406e2e9" name="vlan20-if" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0347] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0348] audit: op="connection-add" uuid="f6424c2c-12fa-41c0-bf14-e748692ed057" name="vlan21-if" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0365] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0367] audit: op="connection-add" uuid="dc3ce8e3-bd96-4b72-9d1c-afc8e397e96a" name="vlan22-if" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0381] audit: op="connection-delete" uuid="0fdac447-07b1-3cb9-a4f0-3c44589ad525" name="Wired connection 1" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0392] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0395] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0401] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0405] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (2d7e8ef8-4dc4-445e-b174-26f74c5086ad)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0405] audit: op="connection-activate" uuid="2d7e8ef8-4dc4-445e-b174-26f74c5086ad" name="br-ex-br" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0406] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0407] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0411] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0415] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (06f1c339-6b2b-426c-814f-c828cc208d8b)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0416] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0417] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0420] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0423] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (70ce8507-d301-4655-9681-eab017a362b1)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0425] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0425] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0429] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0432] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (cd03775e-e9e7-4b88-8d70-561e70b45053)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0434] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0436] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0441] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0445] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (8b5eaba7-2696-47bb-a041-ecc34eed5156)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0448] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0448] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0452] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0455] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ee449963-ce5b-4d48-be71-a3a0c3ff501d)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0456] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0458] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0459] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0465] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0466] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0469] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0472] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (c2fa3927-c00c-480d-bb24-b02f85dd3feb)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0472] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0475] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0476] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0477] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0479] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0488] device (eth1): disconnecting for new activation request.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0488] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0490] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0492] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0493] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0496] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0496] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0498] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0502] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (cb399f0d-9df4-4dda-84a5-500d2406e2e9)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0502] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0505] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0506] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0507] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0509] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0509] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0511] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0515] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (f6424c2c-12fa-41c0-bf14-e748692ed057)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0515] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0518] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0519] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0520] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0522] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <warn>  [1769763873.0522] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0524] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0528] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (dc3ce8e3-bd96-4b72-9d1c-afc8e397e96a)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0528] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0530] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0531] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0532] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0533] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0548] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.addr-gen-mode,ipv6.method" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0549] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0551] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0552] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0558] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0561] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0563] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0566] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0567] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 kernel: ovs-system: entered promiscuous mode
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0572] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0588] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 kernel: Timeout policy base is empty
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0596] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0599] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 systemd-udevd[58303]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0611] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0620] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0626] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0630] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0639] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0647] dhcp4 (eth0): canceled DHCP transaction
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0648] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0648] dhcp4 (eth0): state changed no lease
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0650] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0666] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0671] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58297 uid=0 result="fail" reason="Device is not activated"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0677] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0690] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0699] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0707] device (eth1): disconnecting for new activation request.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0709] audit: op="connection-activate" uuid="fcbae144-51cd-56e4-bfd6-6f248d2a14ce" name="ci-private-network" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0766] dhcp4 (eth0): state changed new lease, address=38.102.83.136
Jan 30 04:04:33 np0005601978 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0851] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58297 uid=0 result="success"
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0859] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 30 04:04:33 np0005601978 kernel: br-ex: entered promiscuous mode
Jan 30 04:04:33 np0005601978 kernel: vlan22: entered promiscuous mode
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0954] device (eth1): Activation: starting connection 'ci-private-network' (fcbae144-51cd-56e4-bfd6-6f248d2a14ce)
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0962] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 systemd-udevd[58301]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0964] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0966] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0967] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0969] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0972] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0983] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.0990] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1000] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1008] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 kernel: vlan21: entered promiscuous mode
Jan 30 04:04:33 np0005601978 systemd-udevd[58302]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1016] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1022] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1029] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1035] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1040] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1047] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1054] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1060] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1068] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1075] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 kernel: vlan20: entered promiscuous mode
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1103] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1122] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 30 04:04:33 np0005601978 systemd-udevd[58394]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1141] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 30 04:04:33 np0005601978 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1164] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1179] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1196] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1197] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1215] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1222] device (eth1): Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1237] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1262] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1268] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1288] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1296] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1309] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1317] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1328] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1343] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1348] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1354] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1359] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1368] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1371] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1415] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1417] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 30 04:04:33 np0005601978 NetworkManager[55525]: <info>  [1769763873.1421] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 30 04:04:34 np0005601978 NetworkManager[55525]: <info>  [1769763874.2690] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58297 uid=0 result="success"
Jan 30 04:04:34 np0005601978 NetworkManager[55525]: <info>  [1769763874.4910] checkpoint[0x560d732a4950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 30 04:04:34 np0005601978 NetworkManager[55525]: <info>  [1769763874.4914] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58297 uid=0 result="success"
Jan 30 04:04:34 np0005601978 NetworkManager[55525]: <info>  [1769763874.9144] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58297 uid=0 result="success"
Jan 30 04:04:34 np0005601978 NetworkManager[55525]: <info>  [1769763874.9158] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58297 uid=0 result="success"
Jan 30 04:04:34 np0005601978 python3.9[58632]: ansible-ansible.legacy.async_status Invoked with jid=j896792223725.58291 mode=status _async_dir=/root/.ansible_async
Jan 30 04:04:35 np0005601978 NetworkManager[55525]: <info>  [1769763875.1499] audit: op="networking-control" arg="global-dns-configuration" pid=58297 uid=0 result="success"
Jan 30 04:04:35 np0005601978 NetworkManager[55525]: <info>  [1769763875.1539] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 30 04:04:35 np0005601978 NetworkManager[55525]: <info>  [1769763875.1586] audit: op="networking-control" arg="global-dns-configuration" pid=58297 uid=0 result="success"
Jan 30 04:04:35 np0005601978 NetworkManager[55525]: <info>  [1769763875.1618] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58297 uid=0 result="success"
Jan 30 04:04:35 np0005601978 NetworkManager[55525]: <info>  [1769763875.3057] checkpoint[0x560d732a4a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 30 04:04:35 np0005601978 NetworkManager[55525]: <info>  [1769763875.3061] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58297 uid=0 result="success"
Jan 30 04:04:35 np0005601978 ansible-async_wrapper.py[58295]: Module complete (58295)
Jan 30 04:04:36 np0005601978 ansible-async_wrapper.py[58294]: Done in kid B.
Jan 30 04:04:38 np0005601978 python3.9[58738]: ansible-ansible.legacy.async_status Invoked with jid=j896792223725.58291 mode=status _async_dir=/root/.ansible_async
Jan 30 04:04:39 np0005601978 python3.9[58837]: ansible-ansible.legacy.async_status Invoked with jid=j896792223725.58291 mode=cleanup _async_dir=/root/.ansible_async
Jan 30 04:04:39 np0005601978 python3.9[58989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:04:40 np0005601978 python3.9[59112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763879.406175-922-17045776260475/.source.returncode _original_basename=.6jhr7ays follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:40 np0005601978 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 30 04:04:40 np0005601978 python3.9[59266]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:04:41 np0005601978 python3.9[59389]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763880.5928495-970-229049283767861/.source.cfg _original_basename=.e6hyo5qi follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:04:42 np0005601978 python3.9[59542]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:04:42 np0005601978 systemd[1]: Reloading Network Manager...
Jan 30 04:04:42 np0005601978 NetworkManager[55525]: <info>  [1769763882.3947] audit: op="reload" arg="0" pid=59546 uid=0 result="success"
Jan 30 04:04:42 np0005601978 NetworkManager[55525]: <info>  [1769763882.3954] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 30 04:04:42 np0005601978 systemd[1]: Reloaded Network Manager.
Jan 30 04:04:42 np0005601978 systemd[1]: session-13.scope: Deactivated successfully.
Jan 30 04:04:42 np0005601978 systemd[1]: session-13.scope: Consumed 47.378s CPU time.
Jan 30 04:04:42 np0005601978 systemd-logind[793]: Session 13 logged out. Waiting for processes to exit.
Jan 30 04:04:42 np0005601978 systemd-logind[793]: Removed session 13.
Jan 30 04:04:51 np0005601978 systemd-logind[793]: New session 14 of user zuul.
Jan 30 04:04:51 np0005601978 systemd[1]: Started Session 14 of User zuul.
Jan 30 04:04:52 np0005601978 python3.9[59731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:04:52 np0005601978 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 30 04:04:53 np0005601978 python3.9[59886]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:04:54 np0005601978 python3.9[60075]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:04:55 np0005601978 systemd[1]: session-14.scope: Deactivated successfully.
Jan 30 04:04:55 np0005601978 systemd[1]: session-14.scope: Consumed 1.985s CPU time.
Jan 30 04:04:55 np0005601978 systemd-logind[793]: Session 14 logged out. Waiting for processes to exit.
Jan 30 04:04:55 np0005601978 systemd-logind[793]: Removed session 14.
Jan 30 04:05:00 np0005601978 systemd-logind[793]: New session 15 of user zuul.
Jan 30 04:05:00 np0005601978 systemd[1]: Started Session 15 of User zuul.
Jan 30 04:05:01 np0005601978 python3.9[60257]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:05:02 np0005601978 python3.9[60411]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:05:03 np0005601978 python3.9[60567]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:05:04 np0005601978 python3.9[60652]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:05:06 np0005601978 python3.9[60805]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:05:07 np0005601978 python3.9[60998]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:08 np0005601978 python3.9[61150]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:05:08 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:05:09 np0005601978 python3.9[61313]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:09 np0005601978 python3.9[61391]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:10 np0005601978 python3.9[61543]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:10 np0005601978 python3.9[61621]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:11 np0005601978 python3.9[61773]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:12 np0005601978 python3.9[61925]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:12 np0005601978 python3.9[62077]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:13 np0005601978 python3.9[62229]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:14 np0005601978 python3.9[62381]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:05:16 np0005601978 python3.9[62534]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:05:17 np0005601978 python3.9[62688]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:05:18 np0005601978 python3.9[62840]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:05:19 np0005601978 python3.9[62992]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:05:20 np0005601978 python3.9[63145]: ansible-service_facts Invoked
Jan 30 04:05:20 np0005601978 network[63162]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:05:20 np0005601978 network[63163]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:05:20 np0005601978 network[63164]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:05:26 np0005601978 python3.9[63616]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:05:29 np0005601978 python3.9[63769]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 30 04:05:30 np0005601978 python3.9[63921]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:31 np0005601978 python3.9[64046]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763929.9473732-652-112250241392762/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:31 np0005601978 python3.9[64200]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:32 np0005601978 python3.9[64325]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763931.3749387-698-14821536176538/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:34 np0005601978 python3.9[64479]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:35 np0005601978 python3.9[64633]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:05:36 np0005601978 python3.9[64717]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:05:45 np0005601978 python3.9[64871]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:05:45 np0005601978 python3.9[64955]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:05:45 np0005601978 chronyd[799]: chronyd exiting
Jan 30 04:05:45 np0005601978 systemd[1]: Stopping NTP client/server...
Jan 30 04:05:45 np0005601978 systemd[1]: chronyd.service: Deactivated successfully.
Jan 30 04:05:45 np0005601978 systemd[1]: Stopped NTP client/server.
Jan 30 04:05:45 np0005601978 systemd[1]: Starting NTP client/server...
Jan 30 04:05:45 np0005601978 chronyd[64963]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 30 04:05:45 np0005601978 chronyd[64963]: Frequency -23.603 +/- 0.443 ppm read from /var/lib/chrony/drift
Jan 30 04:05:45 np0005601978 chronyd[64963]: Loaded seccomp filter (level 2)
Jan 30 04:05:45 np0005601978 systemd[1]: Started NTP client/server.
Jan 30 04:05:46 np0005601978 systemd[1]: session-15.scope: Deactivated successfully.
Jan 30 04:05:46 np0005601978 systemd[1]: session-15.scope: Consumed 21.640s CPU time.
Jan 30 04:05:46 np0005601978 systemd-logind[793]: Session 15 logged out. Waiting for processes to exit.
Jan 30 04:05:46 np0005601978 systemd-logind[793]: Removed session 15.
Jan 30 04:05:51 np0005601978 systemd-logind[793]: New session 16 of user zuul.
Jan 30 04:05:51 np0005601978 systemd[1]: Started Session 16 of User zuul.
Jan 30 04:05:52 np0005601978 python3.9[65142]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:05:53 np0005601978 python3.9[65298]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:54 np0005601978 python3.9[65473]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:54 np0005601978 python3.9[65551]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.hisstw5r recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:55 np0005601978 python3.9[65703]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:56 np0005601978 python3.9[65826]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763955.5362709-139-236458074460256/.source _original_basename=.x8flsy0q follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:05:57 np0005601978 python3.9[65978]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:57 np0005601978 python3.9[66130]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:58 np0005601978 python3.9[66253]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763957.5143547-211-21974739767224/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:05:58 np0005601978 python3.9[66405]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:05:59 np0005601978 python3.9[66528]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769763958.5321598-211-198372140728224/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:06:00 np0005601978 python3.9[66680]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:00 np0005601978 python3.9[66832]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:01 np0005601978 python3.9[66955]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763960.3276527-322-160580812367413/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:01 np0005601978 python3.9[67107]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:02 np0005601978 python3.9[67230]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763961.4229374-367-83317682990249/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:03 np0005601978 python3.9[67382]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:03 np0005601978 systemd[1]: Reloading.
Jan 30 04:06:03 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:03 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:03 np0005601978 systemd[1]: Reloading.
Jan 30 04:06:03 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:03 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:03 np0005601978 systemd[1]: Starting EDPM Container Shutdown...
Jan 30 04:06:03 np0005601978 systemd[1]: Finished EDPM Container Shutdown.
Jan 30 04:06:04 np0005601978 python3.9[67610]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:04 np0005601978 python3.9[67733]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763964.0823538-436-23009165098723/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:05 np0005601978 python3.9[67885]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:06 np0005601978 python3.9[68008]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763965.2211764-481-48113699734108/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:06 np0005601978 python3.9[68160]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:06 np0005601978 systemd[1]: Reloading.
Jan 30 04:06:06 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:06 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:07 np0005601978 systemd[1]: Reloading.
Jan 30 04:06:07 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:07 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:07 np0005601978 systemd[1]: Starting Create netns directory...
Jan 30 04:06:07 np0005601978 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 30 04:06:07 np0005601978 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 30 04:06:07 np0005601978 systemd[1]: Finished Create netns directory.
Jan 30 04:06:08 np0005601978 python3.9[68386]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:06:08 np0005601978 network[68403]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:06:08 np0005601978 network[68404]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:06:08 np0005601978 network[68405]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:06:11 np0005601978 python3.9[68667]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:12 np0005601978 systemd[1]: Reloading.
Jan 30 04:06:12 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:12 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:12 np0005601978 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 30 04:06:12 np0005601978 iptables.init[68707]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 30 04:06:12 np0005601978 iptables.init[68707]: iptables: Flushing firewall rules: [  OK  ]
Jan 30 04:06:12 np0005601978 systemd[1]: iptables.service: Deactivated successfully.
Jan 30 04:06:12 np0005601978 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 30 04:06:13 np0005601978 python3.9[68903]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:14 np0005601978 python3.9[69057]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:06:14 np0005601978 systemd[1]: Reloading.
Jan 30 04:06:14 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:06:14 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:06:14 np0005601978 systemd[1]: Starting Netfilter Tables...
Jan 30 04:06:14 np0005601978 systemd[1]: Finished Netfilter Tables.
Jan 30 04:06:15 np0005601978 python3.9[69250]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:16 np0005601978 python3.9[69403]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:16 np0005601978 python3.9[69528]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763975.663749-689-266839920715935/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:18 np0005601978 python3.9[69681]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:06:18 np0005601978 systemd[1]: Reloading OpenSSH server daemon...
Jan 30 04:06:18 np0005601978 systemd[1]: Reloaded OpenSSH server daemon.
Jan 30 04:06:18 np0005601978 python3.9[69837]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:19 np0005601978 python3.9[69989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:19 np0005601978 python3.9[70112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763979.0407217-781-77614113760878/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:20 np0005601978 python3.9[70264]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 30 04:06:21 np0005601978 systemd[1]: Starting Time & Date Service...
Jan 30 04:06:21 np0005601978 systemd[1]: Started Time & Date Service.
Jan 30 04:06:21 np0005601978 python3.9[70420]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:22 np0005601978 python3.9[70572]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:22 np0005601978 python3.9[70695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763982.0013957-886-91096685996440/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:23 np0005601978 python3.9[70847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:23 np0005601978 python3.9[70970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769763983.1151378-931-134721608884342/.source.yaml _original_basename=.77x1xv6a follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:24 np0005601978 python3.9[71122]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:25 np0005601978 python3.9[71245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763984.2074215-976-259969449631853/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:25 np0005601978 python3.9[71397]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:26 np0005601978 python3.9[71550]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:27 np0005601978 python3[71703]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 30 04:06:27 np0005601978 python3.9[71855]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:28 np0005601978 python3.9[71978]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763987.368062-1093-264210496139917/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:29 np0005601978 python3.9[72130]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:29 np0005601978 python3.9[72253]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763988.5665944-1139-125736880152818/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:30 np0005601978 python3.9[72405]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:30 np0005601978 python3.9[72528]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763989.698619-1183-138339124744524/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:31 np0005601978 python3.9[72680]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:31 np0005601978 python3.9[72803]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763990.7932093-1228-268602329166966/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:32 np0005601978 python3.9[72955]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:06:32 np0005601978 python3.9[73078]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769763991.962858-1273-171671826034543/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:33 np0005601978 python3.9[73230]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:34 np0005601978 python3.9[73382]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
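The pipeline above only checks the generated ruleset: the -c flag makes nft parse and validate what it reads from stdin without committing anything to the kernel. A minimal sketch of the check step, using the same file order as the logged command (chains first, so later rules can reference them):

    # Dry run: -c validates the concatenated ruleset read from stdin ("-").
    cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft \
        /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft \
        /etc/nftables/edpm-jumps.nft | nft -c -f -
    # The later apply step (see the session-18 entries below) runs a similar
    # pipeline without -c to actually load the rules.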
Jan 30 04:06:35 np0005601978 python3.9[73541]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
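The blockinfile task keeps the persistent configuration read by nftables.service at boot in sync with the generated files. Decoding the #012 newline escapes in the logged block, the managed section it maintains in /etc/sysconfig/nftables.conf is the set of include statements shown in the comments below, and the validate option re-checks the whole file before it is saved:

    # Managed block written between the BEGIN/END markers:
    #   # BEGIN ANSIBLE MANAGED BLOCK
    #   include "/etc/nftables/iptables.nft"
    #   include "/etc/nftables/edpm-chains.nft"
    #   include "/etc/nftables/edpm-rules.nft"
    #   include "/etc/nftables/edpm-jumps.nft"
    #   # END ANSIBLE MANAGED BLOCK
    # Validation run by the module before the file is replaced:
    nft -c -f /etc/sysconfig/nftables.conf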
Jan 30 04:06:35 np0005601978 python3.9[73694]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:36 np0005601978 python3.9[73846]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:37 np0005601978 python3.9[73998]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 30 04:06:37 np0005601978 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]

Jan 30 04:06:37 np0005601978 python3.9[74152]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
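The two ansible.posix.mount tasks create hugetlbfs mounts with explicit page sizes; state=mounted both mounts them now and records fstab entries so they persist across reboots. A hand-run equivalent, assuming the mount point directories created just above:

    mount -t hugetlbfs -o pagesize=1G none /dev/hugepages1G
    mount -t hugetlbfs -o pagesize=2M none /dev/hugepages2M
    # Matching /etc/fstab lines for persistence (dump=0, passno=0 as logged):
    #   none /dev/hugepages1G hugetlbfs pagesize=1G 0 0
    #   none /dev/hugepages2M hugetlbfs pagesize=2M 0 0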
Jan 30 04:06:38 np0005601978 systemd[1]: session-16.scope: Deactivated successfully.
Jan 30 04:06:38 np0005601978 systemd[1]: session-16.scope: Consumed 28.022s CPU time.
Jan 30 04:06:38 np0005601978 systemd-logind[793]: Session 16 logged out. Waiting for processes to exit.
Jan 30 04:06:38 np0005601978 systemd-logind[793]: Removed session 16.
Jan 30 04:06:43 np0005601978 systemd-logind[793]: New session 17 of user zuul.
Jan 30 04:06:43 np0005601978 systemd[1]: Started Session 17 of User zuul.
Jan 30 04:06:44 np0005601978 python3.9[74333]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 30 04:06:45 np0005601978 python3.9[74485]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:06:46 np0005601978 python3.9[74637]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:06:47 np0005601978 python3.9[74789]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCU0CWF83KfXqSNJSVlw6CgEfyzQlVc+sQ44mcTHx301r7khNpqGqzzMl2b7B3g23zeOn2rm6WkcSpEWp9NX1E4FzlirH03uNb4wmfTil8FM0ijgykv8ayTE+qg8rqjVN/609BkbvtuXEXKvifnL2QLn5d86JoMfCX4sZANxlKS0zIXNNzOBWfSuQwG9mcnFwqhhkxuKK6MvSWgH2by+gVux3vL2E+9Hp4A5jNgMsfbW7Mq2euCLpntWO+yOZTGN9eLLPWiwuU0k+gM8FTa94oNFKnKgCvyJ1Yvg8J93qul2lVDzABn7E2fibM0HHLQtYqyzaUeT5cZ+wj8IAEo8jJAzLVNNiJ1oQMvubGWdIKqe4xqnCPaHoxgm9aZDfmS7jAuvdckG23zZ0JhSsWlrtN7xI9QZ810/hmRpikxGQLwKhiy9r8eoNZfI9KpodKz+Fe4hGBj9Q+HVg+jQ5arbPWHcmHTzigE8RTwfEaquYWbkoTPrSY91r+5IPJhckAV3bk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPwxHcThrtnoKJGePVfRk58vCQwCWYpQ4iFWVlZQL0zh#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMPX+Rlv1VZVTcJHTQQZnMHb8kJKLhFtYoOy8z2Chbt2lwi784rpWFzb7jWcZongf3UHJoSP+5IOd0+d2b54PM0=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwf6eb4Ud21yPWubeGZA8uqtixuwZzeZ5UoBadQadVnxVH49Nw/8ibZzt2wsZTIKtbNU2R+eHrDaNVB1QxJpVYXvYDUsG2RWKZCFKz2SZ1tEQam/R+2D1hXGc9qrD7+TEPDx2Wwrc1ss+ednMEHn2Gzy9CEjhMe4wcJF98yV52TUd6QHjOK8V+5pjSX18HaGbYe+l7oUb5mu8HWJkRVT5UlSWHFNksxYtLhhWMLshFBvIFNyHvIyYu6CSVwJJ2u6EbGORY9hnxfgn19lmHuOFr3KM1piYtNbTo6Y0A4ihCObgmnlyjzosioo92t/7T5zL3zC3HRHO8LUIfOvGRqe3ZSDDX7r7vc3u5p4dMxUnqag2BMfwS4Yz5s16GMDMWb1NfzKCsI0RGbIglZ8ZE7HWvULCMQGFbk0oAYI5B4+5b+/1b2ErJHvmhy77YeMDj+ZZkcJBXLVwyCUYYmyarfkr6dDJnuM7fkW5rOnfzcCketCIZCcQnpkBAKV/IudZE1/c=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIL/P3OiVbM10OAPjF9Amd7xZhZLxY4V0EuaAx+wvjSF/#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAV7AMUgUeqiy7iD31QxTURySlKR2fBxMPCblDeBG/dj+f9J1PQ8cZLAV3XTLdCxjk7cuvqZgV97OuUkxYig25g=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCRk+IxZ0/uyz964ul4FIalWLU+RmsFaMHhJTZwt8ByL08m86HRErkmNy7Z6oJd2fxocJ/fCs5OvEyX/ZUPGxV3Mdbt0IwdCMTAb7Z/A4w+ie2xsV62iidAd4Wi03Wd4m0C7UCv3kNtNk8rYqBZ89W5iMmhXXxZNrul7hmbDHACoG3cA9hefsGM0x52FXCZggf/iwYhbSE44ql/H+/TnZedC9ooElCLTzIV9JGpzCvErYQ8RGZ07EOfIRgSZ0Pa7mUwTRKgiGsi2KPaX6MUL4KHPQk3gBiwY38ibUR9mpxEnbCq5FQwaKLar2+KgMZAZ2v+iPXKa4V2nCL0MP8tZrtovt65vorRrmz7oWxyAwu/FMyHS9ogS9yeAH7pRFasZ7cru3FC3mHim5acBZnuiek8BCcTvxdsJFGlYEHwtiUKSUF3nysO3JfWii2Pe4WeVK63UdTfAaGeC7J+AV4a9BFTf98lUdXDM1osJ08Z6pWmnrCxASYHGYogUhwgL0NnRH0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICurNT8B2BrfPznsK5CLFzT9Xwr6Yrnz0KCZMpcruyIL#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF1KO3gRI6MewvlkrVvSUR1n2NF/WbCjfKMKcsninu/Qnl23QC5T9OSewdOY7mdImHiKVFMnjt5d4TIcXgyEQ+I=#012 create=True mode=0644 path=/tmp/ansible.t_a09r_c state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:06:48 np0005601978 python3.9[74941]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.t_a09r_c' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:49 np0005601978 python3.9[75095]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.t_a09r_c state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
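The three tasks above assemble the cluster-wide /etc/ssh/ssh_known_hosts: host keys for all compute nodes are written into a temporary file, that file is copied over the system known_hosts, and the temp file is removed. One way to confirm the result with stock OpenSSH tooling (the lookup should print the ssh-rsa, ssh-ed25519 and ecdsa entries recorded for the host):

    # Look up the recorded keys for one compute host by name...
    ssh-keygen -F compute-1.ctlplane.example.com -f /etc/ssh/ssh_known_hosts
    # ...or by its ctlplane address.
    ssh-keygen -F 192.168.122.101 -f /etc/ssh/ssh_known_hosts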
Jan 30 04:06:49 np0005601978 systemd[1]: session-17.scope: Deactivated successfully.
Jan 30 04:06:49 np0005601978 systemd[1]: session-17.scope: Consumed 2.819s CPU time.
Jan 30 04:06:49 np0005601978 systemd-logind[793]: Session 17 logged out. Waiting for processes to exit.
Jan 30 04:06:49 np0005601978 systemd-logind[793]: Removed session 17.
Jan 30 04:06:51 np0005601978 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 30 04:06:54 np0005601978 systemd-logind[793]: New session 18 of user zuul.
Jan 30 04:06:54 np0005601978 systemd[1]: Started Session 18 of User zuul.
Jan 30 04:06:55 np0005601978 python3.9[75275]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:06:57 np0005601978 python3.9[75431]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 30 04:06:58 np0005601978 python3.9[75585]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:06:58 np0005601978 python3.9[75738]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:06:59 np0005601978 python3.9[75891]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:07:00 np0005601978 python3.9[76045]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:07:01 np0005601978 python3.9[76200]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:01 np0005601978 systemd[1]: session-18.scope: Deactivated successfully.
Jan 30 04:07:01 np0005601978 systemd[1]: session-18.scope: Consumed 4.075s CPU time.
Jan 30 04:07:01 np0005601978 systemd-logind[793]: Session 18 logged out. Waiting for processes to exit.
Jan 30 04:07:01 np0005601978 systemd-logind[793]: Removed session 18.
Jan 30 04:07:06 np0005601978 systemd-logind[793]: New session 19 of user zuul.
Jan 30 04:07:06 np0005601978 systemd[1]: Started Session 19 of User zuul.
Jan 30 04:07:07 np0005601978 python3.9[76378]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:07:08 np0005601978 python3.9[76534]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:07:09 np0005601978 python3.9[76618]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 30 04:07:11 np0005601978 python3.9[76769]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
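needs-restarting -r, from the yum-utils package installed in the previous task, reports whether the running kernel or core userspace is older than what is now installed; by convention its exit status (1 when a reboot is recommended, 0 otherwise) is what the playbook inspects. A minimal sketch:

    # Exit status 0: no reboot needed; 1: reboot recommended.
    needs-restarting -r
    echo "reboot check exit status: $?"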
Jan 30 04:07:13 np0005601978 python3.9[76920]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:07:13 np0005601978 python3.9[77070]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:07:14 np0005601978 python3.9[77220]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:07:14 np0005601978 systemd[1]: session-19.scope: Deactivated successfully.
Jan 30 04:07:14 np0005601978 systemd[1]: session-19.scope: Consumed 5.171s CPU time.
Jan 30 04:07:14 np0005601978 systemd-logind[793]: Session 19 logged out. Waiting for processes to exit.
Jan 30 04:07:14 np0005601978 systemd-logind[793]: Removed session 19.
Jan 30 04:07:19 np0005601978 systemd-logind[793]: New session 20 of user zuul.
Jan 30 04:07:19 np0005601978 systemd[1]: Started Session 20 of User zuul.
Jan 30 04:07:20 np0005601978 python3.9[77399]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:07:22 np0005601978 python3.9[77555]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:23 np0005601978 python3.9[77707]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:24 np0005601978 python3.9[77859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:24 np0005601978 python3.9[77982]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764043.6463177-152-194187677395789/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=de926a07d3f08d7212f8678152f9fac0141b5d16 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:25 np0005601978 python3.9[78134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:26 np0005601978 python3.9[78257]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764045.0260925-152-65279638095887/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=c4ac8da60190f1ab0f303a8bc23860222db62bf0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:26 np0005601978 python3.9[78409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:27 np0005601978 python3.9[78532]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764046.3523655-152-153172384923465/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=bce7f0103ed3e9b93f46a30a205343e5d161cb24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
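This is the first of several identical triplets: each service gets tls.crt, ca.crt and tls.key under /var/lib/openstack/certs/<service>/default, root-owned with mode 0600 and the container_file_t context on the directory. A quick sanity check of a freshly copied set, sketched here for the libvirt paths and assuming OpenSSL is available on the host (the two public-key digests should be identical):

    cd /var/lib/openstack/certs/libvirt/default
    # The server certificate should chain to the delivered CA.
    openssl verify -CAfile ca.crt tls.crt
    # The private key should match the certificate's public key.
    openssl x509 -noout -pubkey -in tls.crt | openssl sha256
    openssl pkey -pubout -in tls.key | openssl sha256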
Jan 30 04:07:27 np0005601978 python3.9[78684]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:28 np0005601978 python3.9[78836]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:28 np0005601978 python3.9[78988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:29 np0005601978 python3.9[79111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764048.5713153-333-115411305850929/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=7c706ed2decdbd5fec514cb9564ba7b1ebeabd6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:29 np0005601978 python3.9[79263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:30 np0005601978 python3.9[79386]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764049.4831805-333-177401434655331/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=20e5b9de999b0f369806d5bf4ae2497d7bf9e0ce backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:31 np0005601978 python3.9[79538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:31 np0005601978 python3.9[79661]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764050.8778782-333-183807574109191/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=0a2001e660c9ede0fbfe54cbee9669f97af257a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:32 np0005601978 python3.9[79813]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:33 np0005601978 python3.9[79965]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:33 np0005601978 python3.9[80117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:34 np0005601978 python3.9[80240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764053.2266665-497-86161223428717/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=618a967236d15d02d0e9ccc078495767e264756c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:34 np0005601978 python3.9[80392]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:35 np0005601978 python3.9[80515]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764054.210502-497-278084177892076/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=ce5169879430fd2d3d983cf42225c6608c73f6d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:35 np0005601978 python3.9[80667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:36 np0005601978 python3.9[80790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764055.5124876-497-45157753398939/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=bfcad57fcb9bbbf0c78dbf6f4c608b8223dbe27b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:37 np0005601978 python3.9[80942]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:37 np0005601978 python3.9[81094]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:38 np0005601978 python3.9[81246]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:38 np0005601978 python3.9[81369]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764057.9295504-683-196126948600643/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=49ad2192feebb2323d6373e3c3d254dd39c707bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:39 np0005601978 python3.9[81521]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:39 np0005601978 python3.9[81644]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764058.9333298-683-153372970618568/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=ce5169879430fd2d3d983cf42225c6608c73f6d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:40 np0005601978 python3.9[81796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:41 np0005601978 python3.9[81919]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764060.2006316-683-113819535503157/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=68dbbd29202fbf748a5907d2692445a7d1e5c86d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:42 np0005601978 python3.9[82071]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:42 np0005601978 python3.9[82223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:43 np0005601978 python3.9[82346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764062.505239-876-20742866857592/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:43 np0005601978 python3.9[82498]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:44 np0005601978 python3.9[82650]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:45 np0005601978 python3.9[82773]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764064.1091292-945-95631094061639/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:45 np0005601978 python3.9[82925]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:46 np0005601978 python3.9[83077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:46 np0005601978 python3.9[83200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764065.9922385-1015-224538235094737/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:47 np0005601978 python3.9[83352]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:48 np0005601978 python3.9[83504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:48 np0005601978 python3.9[83627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764067.848605-1090-134791629788550/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:49 np0005601978 python3.9[83779]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:50 np0005601978 python3.9[83931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:50 np0005601978 python3.9[84054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764069.533842-1157-107274030546161/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:51 np0005601978 python3.9[84206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:52 np0005601978 python3.9[84358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:52 np0005601978 python3.9[84481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764071.6159427-1227-50400182496742/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:53 np0005601978 python3.9[84633]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:07:53 np0005601978 python3.9[84785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:07:54 np0005601978 python3.9[84908]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764073.3454452-1302-704974975529/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f67ef16f0caa82a36466163efc630d4be8f81ef5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:07:55 np0005601978 chronyd[64963]: Selected source 198.161.203.36 (pool.ntp.org)
Jan 30 04:07:55 np0005601978 systemd[1]: session-20.scope: Deactivated successfully.
Jan 30 04:07:55 np0005601978 systemd[1]: session-20.scope: Consumed 24.147s CPU time.
Jan 30 04:07:55 np0005601978 systemd-logind[793]: Session 20 logged out. Waiting for processes to exit.
Jan 30 04:07:55 np0005601978 systemd-logind[793]: Removed session 20.
Jan 30 04:08:01 np0005601978 systemd-logind[793]: New session 21 of user zuul.
Jan 30 04:08:01 np0005601978 systemd[1]: Started Session 21 of User zuul.
Jan 30 04:08:02 np0005601978 python3.9[85089]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:08:03 np0005601978 python3.9[85245]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:04 np0005601978 python3.9[85397]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:05 np0005601978 python3.9[85547]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:08:06 np0005601978 python3.9[85699]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
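ansible.posix.seboolean with persistent=True is the module form of setsebool -P, which rebuilds and reloads the SELinux policy; that is why the dbus-broker-launch line that follows records an avc op=load_policy event. The manual equivalents:

    # Persistently enable the boolean (triggers a policy rebuild and reload).
    setsebool -P virt_sandbox_use_netlink on
    # Verify the current value.
    getsebool virt_sandbox_use_netlink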
Jan 30 04:08:08 np0005601978 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 30 04:08:08 np0005601978 python3.9[85855]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:08:09 np0005601978 python3.9[85939]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:08:11 np0005601978 python3.9[86092]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:08:12 np0005601978 python3[86247]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 30 04:08:13 np0005601978 python3.9[86399]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:13 np0005601978 python3.9[86551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:14 np0005601978 python3.9[86629]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:14 np0005601978 python3.9[86781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:15 np0005601978 python3.9[86859]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.bh2pfndl recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:16 np0005601978 python3.9[87011]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:16 np0005601978 python3.9[87089]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:17 np0005601978 python3.9[87241]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:18 np0005601978 python3[87394]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 30 04:08:19 np0005601978 python3.9[87546]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:19 np0005601978 python3.9[87671]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764098.6312425-427-253635207033886/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:20 np0005601978 python3.9[87823]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:20 np0005601978 python3.9[87948]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764099.9433494-472-236956233181498/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:21 np0005601978 python3.9[88100]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:22 np0005601978 python3.9[88225]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764101.246869-517-220205346336459/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:22 np0005601978 python3.9[88377]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:23 np0005601978 python3.9[88502]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764102.4771464-562-259183809564475/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:24 np0005601978 python3.9[88654]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:24 np0005601978 python3.9[88779]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764103.731984-607-83578061048470/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:25 np0005601978 python3.9[88931]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:26 np0005601978 python3.9[89083]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:26 np0005601978 python3.9[89238]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:27 np0005601978 python3.9[89390]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:28 np0005601978 python3.9[89543]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:08:28 np0005601978 python3.9[89697]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:29 np0005601978 python3.9[89852]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
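The sequence above is the complete nftables handling for this node: the role writes the jump, flush, chain and rule fragments under /etc/nftables, syntax-checks them as one ruleset, records the includes in /etc/sysconfig/nftables.conf, loads the chains, and only then flushes and reloads the rules before clearing the .changed marker. A minimal Python sketch of that check-and-apply sequence, using the file names and nft invocations exactly as logged (the helper names concat and nft are illustrative, not part of the role):

    import subprocess

    NFT_DIR = "/etc/nftables"
    ALL_FRAGMENTS = ["edpm-chains.nft", "edpm-flushes.nft", "edpm-rules.nft",
                     "edpm-update-jumps.nft", "edpm-jumps.nft"]
    APPLY_FRAGMENTS = ["edpm-flushes.nft", "edpm-rules.nft", "edpm-update-jumps.nft"]

    def concat(fragments):
        """Concatenate rule fragments in the order shown in the log."""
        return "".join(open(f"{NFT_DIR}/{name}").read() for name in fragments)

    def nft(args, ruleset=None):
        """Run nft, feeding the concatenated ruleset on stdin when reading from '-'."""
        subprocess.run(["nft", *args], input=ruleset, text=True, check=True)

    nft(["-c", "-f", "-"], concat(ALL_FRAGMENTS))    # cat <all fragments> | nft -c -f -
    nft(["-f", f"{NFT_DIR}/edpm-chains.nft"])        # nft -f /etc/nftables/edpm-chains.nft
    nft(["-f", "-"], concat(APPLY_FRAGMENTS))        # cat <flushes,rules,update-jumps> | nft -f -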
Jan 30 04:08:30 np0005601978 python3.9[90002]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:08:31 np0005601978 python3.9[90155]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:9e:41:65:cf" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:31 np0005601978 ovs-vsctl[90156]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:9e:41:65:cf external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
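The two entries above are the chassis registration step: every ovn-* setting is written to the external_ids column of the local Open_vSwitch record in a single ovs-vsctl call, which ovn-controller later reads to find its integration bridge, encapsulation IP and southbound database. A small Python sketch that builds the same call from a mapping, with a subset of the values copied from the log (the helper name set_external_ids is illustrative):

    import subprocess

    def set_external_ids(ids):
        """Write external_ids keys on the local Open_vSwitch record in one ovs-vsctl call."""
        args = ["ovs-vsctl", "set", "open", "."]
        args += [f"external_ids:{key}={value}" for key, value in ids.items()]
        subprocess.run(args, check=True)

    set_external_ids({
        "hostname": "compute-1.ctlplane.example.com",
        "ovn-bridge": "br-int",
        "ovn-bridge-mappings": "datacentre:br-ex",
        "ovn-encap-ip": "172.19.0.101",
        "ovn-encap-type": "geneve",
        "ovn-monitor-all": "True",
        "ovn-remote": "ssl:ovsdbserver-sb.openstack.svc:6642",
    })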
Jan 30 04:08:32 np0005601978 python3.9[90308]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:33 np0005601978 python3.9[90463]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:08:33 np0005601978 ovs-vsctl[90464]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 30 04:08:33 np0005601978 python3.9[90614]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:08:34 np0005601978 python3.9[90768]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:35 np0005601978 python3.9[90920]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:35 np0005601978 python3.9[90998]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:36 np0005601978 python3.9[91150]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:36 np0005601978 python3.9[91228]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:37 np0005601978 python3.9[91380]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:37 np0005601978 python3.9[91532]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:38 np0005601978 python3.9[91610]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:38 np0005601978 python3.9[91762]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:39 np0005601978 python3.9[91840]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:39 np0005601978 python3.9[91992]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:08:39 np0005601978 systemd[1]: Reloading.
Jan 30 04:08:39 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:39 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:40 np0005601978 python3.9[92181]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:41 np0005601978 python3.9[92259]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:41 np0005601978 python3.9[92411]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:42 np0005601978 python3.9[92489]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:43 np0005601978 python3.9[92642]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:08:43 np0005601978 systemd[1]: Reloading.
Jan 30 04:08:43 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:43 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:43 np0005601978 systemd[1]: Starting Create netns directory...
Jan 30 04:08:43 np0005601978 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 30 04:08:43 np0005601978 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 30 04:08:43 np0005601978 systemd[1]: Finished Create netns directory.
Jan 30 04:08:44 np0005601978 python3.9[92836]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:44 np0005601978 python3.9[92988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:45 np0005601978 python3.9[93111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764124.3838096-1360-190982093184069/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:46 np0005601978 python3.9[93263]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:47 np0005601978 python3.9[93415]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:08:47 np0005601978 python3.9[93567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:08:48 np0005601978 python3.9[93690]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764127.4514008-1459-186056564907381/.source.json _original_basename=.yn7ok6d4 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:49 np0005601978 python3.9[93840]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:51 np0005601978 python3.9[94263]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 30 04:08:52 np0005601978 python3.9[94415]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:08:53 np0005601978 python3[94567]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:08:53 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:08:53 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:08:53 np0005601978 podman[94604]: 2026-01-30 09:08:53.787634942 +0000 UTC m=+0.037374349 container create 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 30 04:08:53 np0005601978 podman[94604]: 2026-01-30 09:08:53.767200885 +0000 UTC m=+0.016940332 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 30 04:08:53 np0005601978 python3[94567]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
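The PODMAN-CONTAINER-DEBUG entry above is the full podman create call that edpm_container_manage derives from the ovn_controller config_data shown in the container create event. A rough sketch of that translation from config keys to podman arguments, limited to the options visible in the log (the function name config_to_podman_args is illustrative and simplified):

    def config_to_podman_args(name, cfg):
        """Map a container config dict onto podman create arguments (subset of options seen above)."""
        args = ["podman", "create", "--name", name, "--log-driver", "journald"]
        for key, value in cfg.get("environment", {}).items():
            args += ["--env", f"{key}={value}"]
        if "healthcheck" in cfg:
            args += ["--healthcheck-command", cfg["healthcheck"]["test"]]
        if cfg.get("net") == "host":
            args += ["--network", "host"]
        if cfg.get("privileged"):
            args.append("--privileged=True")
        if "user" in cfg:
            args += ["--user", cfg["user"]]
        for volume in cfg.get("volumes", []):
            args += ["--volume", volume]
        args.append(cfg["image"])
        return args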
Jan 30 04:08:54 np0005601978 python3.9[94795]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:08:54 np0005601978 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 30 04:08:55 np0005601978 python3.9[94949]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:55 np0005601978 python3.9[95025]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:08:56 np0005601978 python3.9[95176]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764135.6856854-1693-151574057855488/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:08:57 np0005601978 python3.9[95252]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:08:57 np0005601978 systemd[1]: Reloading.
Jan 30 04:08:57 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:57 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:58 np0005601978 python3.9[95363]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:08:58 np0005601978 systemd[1]: Reloading.
Jan 30 04:08:58 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:58 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:58 np0005601978 systemd[1]: Starting ovn_controller container...
Jan 30 04:08:58 np0005601978 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 30 04:08:58 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:08:58 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54dd0f0f4e030841193ea7e680b45a56927d0265da870ccbff5363702ddb86ae/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 30 04:08:58 np0005601978 systemd[1]: Started /usr/bin/podman healthcheck run 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089.
Jan 30 04:08:58 np0005601978 podman[95403]: 2026-01-30 09:08:58.489235661 +0000 UTC m=+0.120051141 container init 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + sudo -E kolla_set_configs
Jan 30 04:08:58 np0005601978 podman[95403]: 2026-01-30 09:08:58.514050385 +0000 UTC m=+0.144865815 container start 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 30 04:08:58 np0005601978 edpm-start-podman-container[95403]: ovn_controller
Jan 30 04:08:58 np0005601978 systemd[1]: Created slice User Slice of UID 0.
Jan 30 04:08:58 np0005601978 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 30 04:08:58 np0005601978 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 30 04:08:58 np0005601978 systemd[1]: Starting User Manager for UID 0...
Jan 30 04:08:58 np0005601978 edpm-start-podman-container[95402]: Creating additional drop-in dependency for "ovn_controller" (4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089)
Jan 30 04:08:58 np0005601978 podman[95426]: 2026-01-30 09:08:58.588367692 +0000 UTC m=+0.064364646 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:08:58 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:08:58 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Failed with result 'exit-code'.
Jan 30 04:08:58 np0005601978 systemd[1]: Reloading.
Jan 30 04:08:58 np0005601978 systemd[95458]: Queued start job for default target Main User Target.
Jan 30 04:08:58 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:08:58 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:08:58 np0005601978 systemd[95458]: Created slice User Application Slice.
Jan 30 04:08:58 np0005601978 systemd[95458]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 30 04:08:58 np0005601978 systemd[95458]: Started Daily Cleanup of User's Temporary Directories.
Jan 30 04:08:58 np0005601978 systemd[95458]: Reached target Paths.
Jan 30 04:08:58 np0005601978 systemd[95458]: Reached target Timers.
Jan 30 04:08:58 np0005601978 systemd[95458]: Starting D-Bus User Message Bus Socket...
Jan 30 04:08:58 np0005601978 systemd[95458]: Starting Create User's Volatile Files and Directories...
Jan 30 04:08:58 np0005601978 systemd[95458]: Finished Create User's Volatile Files and Directories.
Jan 30 04:08:58 np0005601978 systemd[95458]: Listening on D-Bus User Message Bus Socket.
Jan 30 04:08:58 np0005601978 systemd[95458]: Reached target Sockets.
Jan 30 04:08:58 np0005601978 systemd[95458]: Reached target Basic System.
Jan 30 04:08:58 np0005601978 systemd[95458]: Reached target Main User Target.
Jan 30 04:08:58 np0005601978 systemd[95458]: Startup finished in 91ms.
Jan 30 04:08:58 np0005601978 systemd[1]: Started User Manager for UID 0.
Jan 30 04:08:58 np0005601978 systemd[1]: Started ovn_controller container.
Jan 30 04:08:58 np0005601978 systemd[1]: Started Session c1 of User root.
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: INFO:__main__:Validating config file
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: INFO:__main__:Writing out command to execute
Jan 30 04:08:58 np0005601978 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: ++ cat /run_command
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + ARGS=
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + sudo kolla_copy_cacerts
Jan 30 04:08:58 np0005601978 systemd[1]: Started Session c2 of User root.
Jan 30 04:08:58 np0005601978 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + [[ ! -n '' ]]
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + . kolla_extend_start
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + umask 0022
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:08:58Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:08:58Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:08:58Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:08:58Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:08:58Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:08:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:08:58Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 30 04:08:58 np0005601978 NetworkManager[55525]: <info>  [1769764138.9411] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 30 04:08:58 np0005601978 NetworkManager[55525]: <info>  [1769764138.9420] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 30 04:08:58 np0005601978 NetworkManager[55525]: <warn>  [1769764138.9423] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 30 04:08:58 np0005601978 NetworkManager[55525]: <info>  [1769764138.9430] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 30 04:08:58 np0005601978 NetworkManager[55525]: <info>  [1769764138.9436] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 30 04:08:58 np0005601978 NetworkManager[55525]: <info>  [1769764138.9440] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 30 04:08:58 np0005601978 kernel: br-int: entered promiscuous mode
Jan 30 04:08:58 np0005601978 systemd-udevd[95553]: Network interface NamePolicy= disabled on kernel command line.
Jan 30 04:08:59 np0005601978 python3.9[95681]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:08:59 np0005601978 ovn_controller[95419]: 2026-01-30T09:08:59Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:09:00 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:00Z|00008|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:09:01 np0005601978 python3.9[95833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:01 np0005601978 python3.9[95956]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764140.5582986-1828-19284473492612/.source.yaml _original_basename=.oenyoq7_ follow=False checksum=ec333544c79641cd730121880e32bc9e0db5fd7e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:01Z|00009|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:09:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:01Z|00010|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:09:02 np0005601978 python3.9[96108]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:09:02 np0005601978 ovs-vsctl[96109]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 30 04:09:03 np0005601978 python3.9[96261]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:09:03 np0005601978 ovs-vsctl[96263]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 30 04:09:03 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:03Z|00011|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:09:04 np0005601978 python3.9[96416]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:09:04 np0005601978 ovs-vsctl[96417]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 30 04:09:04 np0005601978 systemd[1]: session-21.scope: Deactivated successfully.
Jan 30 04:09:04 np0005601978 systemd-logind[793]: Session 21 logged out. Waiting for processes to exit.
Jan 30 04:09:04 np0005601978 systemd[1]: session-21.scope: Consumed 39.421s CPU time.
Jan 30 04:09:04 np0005601978 systemd-logind[793]: Removed session 21.
Jan 30 04:09:05 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:05Z|00012|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:09:05 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:05Z|00013|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:09:09 np0005601978 systemd[1]: Stopping User Manager for UID 0...
Jan 30 04:09:09 np0005601978 systemd[95458]: Activating special unit Exit the Session...
Jan 30 04:09:09 np0005601978 systemd[95458]: Stopped target Main User Target.
Jan 30 04:09:09 np0005601978 systemd[95458]: Stopped target Basic System.
Jan 30 04:09:09 np0005601978 systemd[95458]: Stopped target Paths.
Jan 30 04:09:09 np0005601978 systemd[95458]: Stopped target Sockets.
Jan 30 04:09:09 np0005601978 systemd[95458]: Stopped target Timers.
Jan 30 04:09:09 np0005601978 systemd[95458]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 30 04:09:09 np0005601978 systemd[95458]: Closed D-Bus User Message Bus Socket.
Jan 30 04:09:09 np0005601978 systemd[95458]: Stopped Create User's Volatile Files and Directories.
Jan 30 04:09:09 np0005601978 systemd[95458]: Removed slice User Application Slice.
Jan 30 04:09:09 np0005601978 systemd[95458]: Reached target Shutdown.
Jan 30 04:09:09 np0005601978 systemd[95458]: Finished Exit the Session.
Jan 30 04:09:09 np0005601978 systemd[95458]: Reached target Exit the Session.
Jan 30 04:09:09 np0005601978 systemd[1]: user@0.service: Deactivated successfully.
Jan 30 04:09:09 np0005601978 systemd[1]: Stopped User Manager for UID 0.
Jan 30 04:09:09 np0005601978 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 30 04:09:09 np0005601978 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 30 04:09:09 np0005601978 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 30 04:09:09 np0005601978 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 30 04:09:09 np0005601978 systemd[1]: Removed slice User Slice of UID 0.
Jan 30 04:09:09 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:09Z|00014|memory|INFO|15424 kB peak resident set size after 11.0 seconds
Jan 30 04:09:09 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:09Z|00015|memory|INFO|idl-cells-Open_vSwitch:408
Jan 30 04:09:09 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:09Z|00016|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:09:10 np0005601978 systemd-logind[793]: New session 23 of user zuul.
Jan 30 04:09:10 np0005601978 systemd[1]: Started Session 23 of User zuul.
Jan 30 04:09:11 np0005601978 python3.9[96597]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:09:12 np0005601978 python3.9[96753]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:13 np0005601978 python3.9[96905]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:13 np0005601978 python3.9[97057]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:13Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:09:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:09:13Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:09:14 np0005601978 python3.9[97209]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:14 np0005601978 python3.9[97361]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:15 np0005601978 python3.9[97511]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:09:16 np0005601978 python3.9[97663]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 30 04:09:18 np0005601978 python3.9[97813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:19 np0005601978 python3.9[97934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764157.9151065-214-140110346819134/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:19 np0005601978 python3.9[98084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:20 np0005601978 python3.9[98205]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764159.3099976-259-13838719372701/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:21 np0005601978 python3.9[98357]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:09:22 np0005601978 python3.9[98441]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:09:24 np0005601978 python3.9[98594]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:09:25 np0005601978 python3.9[98747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:26 np0005601978 python3.9[98868]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764165.2386785-370-1128567281299/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:26 np0005601978 python3.9[99018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:27 np0005601978 python3.9[99139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764166.2899947-370-37478612075847/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:28 np0005601978 python3.9[99289]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:28 np0005601978 podman[99384]: 2026-01-30 09:09:28.856457148 +0000 UTC m=+0.064453848 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=2, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:09:28 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:09:28 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Failed with result 'exit-code'.
Jan 30 04:09:28 np0005601978 python3.9[99422]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764168.1069038-502-183235159340385/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:29 np0005601978 python3.9[99580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:30 np0005601978 python3.9[99701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764169.3458035-502-278162141489412/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
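Note: the two copy tasks above drop 10-neutron-metadata.conf and 05-nova-metadata.conf into /var/lib/openstack/neutron-ovn-metadata-agent with mode 0644 and setype=container_file_t, and the journal records their SHA-1 checksums. A quick host-side verification sketch using only standard tools:
  # compare against the checksums logged by ansible (ca7d4d15... and a14d6b38...)
  sha1sum /var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf \
          /var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf
  # mode 0644 and SELinux type container_file_t are expected from the task parameters
  ls -lZ /var/lib/openstack/neutron-ovn-metadata-agent/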
Jan 30 04:09:30 np0005601978 python3.9[99851]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:09:31 np0005601978 python3.9[100005]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:32 np0005601978 python3.9[100157]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:32 np0005601978 python3.9[100235]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:33 np0005601978 python3.9[100387]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:33 np0005601978 python3.9[100465]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:34 np0005601978 python3.9[100617]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:35 np0005601978 python3.9[100769]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:35 np0005601978 python3.9[100847]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:36 np0005601978 python3.9[100999]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:36 np0005601978 python3.9[101077]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:37 np0005601978 python3.9[101229]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
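Note: the ansible.builtin.systemd call above (daemon_reload=True, enabled=True, state=started) is roughly equivalent to the following manual steps, assuming the unit and preset files installed just before it:
  systemctl daemon-reload
  systemctl enable --now edpm-container-shutdown.service
  # the preset dropped in earlier keeps the unit enabled across preset runs
  cat /etc/systemd/system-preset/91-edpm-container-shutdown.preset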
Jan 30 04:09:37 np0005601978 systemd[1]: Reloading.
Jan 30 04:09:37 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:37 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:38 np0005601978 python3.9[101419]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:38 np0005601978 python3.9[101497]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:39 np0005601978 python3.9[101649]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:39 np0005601978 python3.9[101727]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:40 np0005601978 python3.9[101879]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:09:40 np0005601978 systemd[1]: Reloading.
Jan 30 04:09:40 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:40 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:41 np0005601978 systemd[1]: Starting Create netns directory...
Jan 30 04:09:41 np0005601978 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 30 04:09:41 np0005601978 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 30 04:09:41 np0005601978 systemd[1]: Finished Create netns directory.
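Note: netns-placeholder.service behaves like a oneshot here: it starts, a run-netns-placeholder.mount appears briefly, and the unit deactivates once "Create netns directory" finishes. Judging from the later ovn_metadata_agent volume '/run/netns:/run/netns:shared', its purpose is to make /run/netns exist as a shared mount before containers bind it; a hedged way to confirm that on the host:
  systemctl status netns-placeholder.service
  # /run/netns should exist with shared propagation (assumption based on the volume spec seen later)
  findmnt -o TARGET,PROPAGATION /run/netns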
Jan 30 04:09:41 np0005601978 python3.9[102071]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:42 np0005601978 python3.9[102223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:43 np0005601978 python3.9[102346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764182.2419636-955-85146690133639/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:44 np0005601978 python3.9[102498]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:44 np0005601978 python3.9[102650]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:09:45 np0005601978 python3.9[102802]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:45 np0005601978 python3.9[102925]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764185.1532586-1054-174370208713423/.source.json _original_basename=.0u5ggrlg follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
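Note: ovn_metadata_agent.json is the kolla startup descriptor; the container created later bind-mounts it read-only at /var/lib/kolla/config_files/config.json, where kolla_set_configs reads it (see the INFO:__main__ lines further down). To inspect it on the host:
  # logged checksum a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc, mode 0600
  sha1sum /var/lib/kolla/config_files/ovn_metadata_agent.json
  cat /var/lib/kolla/config_files/ovn_metadata_agent.json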
Jan 30 04:09:46 np0005601978 python3.9[103075]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:48 np0005601978 python3.9[103498]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 30 04:09:49 np0005601978 python3.9[103650]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:09:50 np0005601978 python3[103802]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
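Note: edpm_container_manage reads the *.json startup configs under /var/lib/edpm-config/container-startup-config/ovn_metadata_agent, labels the resulting container with config_id=ovn_metadata_agent, and copies container stdout under the logged log_base_path. Two host-side checks that follow directly from those parameters:
  ls /var/log/containers/stdouts/
  podman ps -a --filter label=config_id=ovn_metadata_agent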
Jan 30 04:09:51 np0005601978 podman[103838]: 2026-01-30 09:09:51.139370432 +0000 UTC m=+0.055071875 container create 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:09:51 np0005601978 podman[103838]: 2026-01-30 09:09:51.107539207 +0000 UTC m=+0.023240720 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 30 04:09:51 np0005601978 python3[103802]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 
quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
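Note: the PODMAN-CONTAINER-DEBUG line above is the exact podman create command the module ran (privileged, host network and PID namespaces, journald logging, the healthcheck command and the certificate/config bind mounts). The result can be cross-checked with podman inspect; the template field paths below are an assumption and can differ slightly between Podman versions:
  podman inspect ovn_metadata_agent --format '{{json .Config.Healthcheck}}'
  podman inspect ovn_metadata_agent --format '{{json .HostConfig.Binds}}'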
Jan 30 04:09:52 np0005601978 python3.9[104028]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:09:52 np0005601978 python3.9[104182]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:53 np0005601978 python3.9[104258]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:09:53 np0005601978 python3.9[104409]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764193.3229215-1288-247151833763446/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:54 np0005601978 python3.9[104485]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:09:54 np0005601978 systemd[1]: Reloading.
Jan 30 04:09:54 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:54 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:55 np0005601978 python3.9[104596]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:09:55 np0005601978 systemd[1]: Reloading.
Jan 30 04:09:55 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:55 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:55 np0005601978 systemd[1]: Starting ovn_metadata_agent container...
Jan 30 04:09:55 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:09:55 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44dec926090c6d0bef4b346ecd9d388b77c53d1daa13c252b40a422c3bf6ae3f/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 30 04:09:55 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44dec926090c6d0bef4b346ecd9d388b77c53d1daa13c252b40a422c3bf6ae3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 30 04:09:55 np0005601978 systemd[1]: Started /usr/bin/podman healthcheck run 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959.
Jan 30 04:09:55 np0005601978 podman[104637]: 2026-01-30 09:09:55.500432966 +0000 UTC m=+0.128226109 container init 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + sudo -E kolla_set_configs
Jan 30 04:09:55 np0005601978 podman[104637]: 2026-01-30 09:09:55.519655824 +0000 UTC m=+0.147448917 container start 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 30 04:09:55 np0005601978 edpm-start-podman-container[104637]: ovn_metadata_agent
Jan 30 04:09:55 np0005601978 edpm-start-podman-container[104636]: Creating additional drop-in dependency for "ovn_metadata_agent" (6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959)
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Validating config file
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Copying service configuration files
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Writing out command to execute
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: ++ cat /run_command
Jan 30 04:09:55 np0005601978 systemd[1]: Reloading.
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + CMD=neutron-ovn-metadata-agent
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + ARGS=
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + sudo kolla_copy_cacerts
Jan 30 04:09:55 np0005601978 podman[104658]: 2026-01-30 09:09:55.602106 +0000 UTC m=+0.070937485 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
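Note: this is the first health_status event for the new container and it already reports healthy with health_failing_streak=0. The same event stream these journal lines come from can be followed live; both commands below are standard podman, with only the container name taken from the log:
  podman events --since 5m --filter container=ovn_metadata_agent
  podman healthcheck run ovn_metadata_agent; echo "exit=$?"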
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + [[ ! -n '' ]]
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + . kolla_extend_start
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: Running command: 'neutron-ovn-metadata-agent'
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + umask 0022
Jan 30 04:09:55 np0005601978 ovn_metadata_agent[104652]: + exec neutron-ovn-metadata-agent
Jan 30 04:09:55 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:09:55 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:09:55 np0005601978 systemd[1]: Started ovn_metadata_agent container.
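Note: at this point the wrapper unit edpm_ovn_metadata_agent.service is active and the container logs to the journal (the create command used --log-driver journald, and the agent's own lines below appear under the ovn_metadata_agent identifier). Either view works for follow-up:
  systemctl status edpm_ovn_metadata_agent.service
  podman logs --tail 50 ovn_metadata_agent
  journalctl -t ovn_metadata_agent -n 50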
Jan 30 04:09:57 np0005601978 python3.9[104890]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.250 104657 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.250 104657 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.250 104657 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
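Note: the DEBUG block that follows is oslo.config's log_opt_values dump, emitted because debug=True; it lists every effective option, with secrets such as metadata_proxy_shared_secret and transport_url masked as ****. The same effective configuration can be read from inside the container, using the config_file and config_dir paths shown in the dump:
  podman exec ovn_metadata_agent cat /etc/neutron/neutron.conf
  podman exec ovn_metadata_agent ls /etc/neutron.conf.d/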
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.251 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.251 104657 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.251 104657 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.251 104657 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.251 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.251 104657 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.251 104657 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.252 104657 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.253 104657 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.254 104657 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.254 104657 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.254 104657 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.254 104657 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.254 104657 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.254 104657 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.254 104657 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.254 104657 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.255 104657 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.256 104657 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.257 104657 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.258 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.259 104657 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.260 104657 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.261 104657 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.262 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.263 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.264 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.265 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.266 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.267 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.268 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.269 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.270 104657 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.271 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.272 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.273 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.274 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.275 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.276 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.277 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.277 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.277 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.277 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.277 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.277 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.277 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.277 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.278 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.278 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.278 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.278 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.278 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.278 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.278 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.278 104657 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.279 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.280 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.280 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.280 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.280 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.280 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.280 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.280 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.280 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.281 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.282 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.282 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.282 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.282 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.282 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.282 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.282 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.282 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.283 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.283 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.283 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.283 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.283 104657 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.283 104657 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
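[editor's note] The block ending at the row of asterisks above is oslo.config's standard log_opt_values dump: every effective option of the neutron OVN metadata agent printed at DEBUG level, one per line, with secrets masked (transport_url = ****). A minimal sketch for pulling those name/value pairs back out of the journal, e.g. to diff this node against a known-good one; the regex and helper below are illustrative only and not part of the agent:

import re

# Hypothetical helper: extract "name = value" pairs from the log_opt_values
# DEBUG lines shown above.
OPT_RE = re.compile(
    r"DEBUG neutron\.agent\.ovn\.metadata_agent \[-\] "
    r"(?P<name>[\w.]+)\s+=\s+(?P<value>.*?) log_opt_values"
)

def parse_opt_dump(lines):
    """Return {option_name: raw_value_string} from journal lines."""
    opts = {}
    for line in lines:
        m = OPT_RE.search(line)
        if m:
            opts[m.group("name")] = m.group("value").strip()
    return opts

# Example with two of the lines above:
sample = [
    'DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values',
    'DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values',
]
print(parse_opt_dump(sample))
# {'nova_metadata_port': '8775', 'ovn.ovn_sb_connection': 'ssl:ovsdbserver-sb.openstack.svc:6642'}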
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.292 104657 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.292 104657 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.293 104657 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.293 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.293 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 30 04:09:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:09:57.300 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 9803b804-d88a-4443-b777-6ecddbb75ed8 (UUID: 9803b804-d88a-4443-b777-6ecddbb75ed8) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
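[editor's note] At this point the agent has connected to the local ovsdb-server on tcp:127.0.0.1:6640 (ovs.ovsdb_connection above) and derived its chassis name and integration bridge br-int. A rough outside-the-agent equivalent, assuming the usual Open_vSwitch external_ids layout (the helper name is illustrative):

import subprocess

# Illustrative only: read the same identity the agent logs above from the local
# Open_vSwitch table; 'system-id' is the chassis name, 'ovn-bridge' the
# integration bridge (br-int when the key is absent).
def ovs_external_id(key, default=""):
    try:
        out = subprocess.check_output(
            ["ovs-vsctl", "get", "Open_vSwitch", ".", f"external_ids:{key}"],
            text=True, stderr=subprocess.DEVNULL)
        return out.strip().strip('"')
    except subprocess.CalledProcessError:
        return default

print(ovs_external_id("system-id"), ovs_external_id("ovn-bridge", "br-int"))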
Jan 30 04:09:58 np0005601978 python3.9[105042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:09:58 np0005601978 python3.9[105167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764197.756432-1423-145958407261160/.source.yaml _original_basename=.meixts9e follow=False checksum=3b6fe052ce520a89275a36a1ba4ff1848cf43bed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:09:59 np0005601978 podman[105192]: 2026-01-30 09:09:59.416273878 +0000 UTC m=+0.077495615 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
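The ovn_controller healthcheck above reports health_status=unhealthy with a failing streak of 3. A minimal sketch for re-running the container's own healthcheck (the '/openstack/healthcheck' script named in its config) and reading back the recorded health state, assuming a recent podman where the record lives under State.Health (older releases used State.Healthcheck):

    import json
    import subprocess

    NAME = "ovn_controller"

    # Trigger the healthcheck command defined for the container;
    # exit code 0 means healthy, 1 means unhealthy.
    rc = subprocess.run(["podman", "healthcheck", "run", NAME]).returncode
    print("healthcheck exit code:", rc)

    # Read back the recorded health state and the recent healthcheck log.
    state = json.loads(subprocess.check_output(["podman", "inspect", NAME]))[0]["State"]
    health = state.get("Health") or state.get("Healthcheck") or {}
    print("status:", health.get("Status"), "failing streak:", health.get("FailingStreak"))
    for entry in health.get("Log") or []:
        print(entry.get("Start"), entry.get("ExitCode"), (entry.get("Output") or "").strip())
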
Jan 30 04:09:59 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:09:59 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Failed with result 'exit-code'.
Jan 30 04:09:59 np0005601978 systemd[1]: session-23.scope: Deactivated successfully.
Jan 30 04:09:59 np0005601978 systemd[1]: session-23.scope: Consumed 29.601s CPU time.
Jan 30 04:09:59 np0005601978 systemd-logind[793]: Session 23 logged out. Waiting for processes to exit.
Jan 30 04:09:59 np0005601978 systemd-logind[793]: Removed session 23.
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.374 104657 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.374 104657 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.375 104657 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.375 104657 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.378 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.385 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.391 104657 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f92c56f70a0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.392 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.393 104657 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.393 104657 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.393 104657 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.396 104657 DEBUG oslo_service.service [-] Started child 105213 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.398 104657 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpv1o5jdkn/privsep.sock']#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.398 105213 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1013538'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.420 105213 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.420 105213 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.421 105213 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.424 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.430 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:10:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.435 105213 INFO eventlet.wsgi.server [-] (105213) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
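The wsgi server above is listening on a unix socket, /var/lib/neutron/metadata_proxy, not a TCP port. A minimal sketch, assuming root access on this host, of probing that socket with Python's http.client; real instance requests reach it through per-network haproxy instances that inject the OVN-specific headers, so a bare probe is only expected to show that the socket answers:

    import http.client
    import socket

    SOCK_PATH = "/var/lib/neutron/metadata_proxy"

    class UnixHTTPConnection(http.client.HTTPConnection):
        """http.client connection that talks to a unix-domain socket."""
        def __init__(self, sock_path):
            super().__init__("localhost")
            self.sock_path = sock_path

        def connect(self):
            self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            self.sock.connect(self.sock_path)

    conn = UnixHTTPConnection(SOCK_PATH)
    conn.request("GET", "/openstack/latest/meta_data.json")
    resp = conn.getresponse()
    # A 4xx here is still informative: it shows the wsgi server behind the
    # socket is answering even without the haproxy-injected OVN headers.
    print(resp.status, resp.reason)
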
Jan 30 04:10:00 np0005601978 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.020 104657 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.022 104657 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpv1o5jdkn/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.917 105218 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.921 105218 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.924 105218 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:00.924 105218 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105218#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.024 105218 DEBUG oslo.privsep.daemon [-] privsep: reply[fd23101f-07bc-415b-835e-232ef8386151]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00019|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00020|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
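The WARN above says a different Chassis record ('d14b9ab5-...') already owns encap IP 172.19.0.101 with type geneve, which blocks this chassis ('9803b804-...') from registering its own encap, typically because a stale chassis entry was left behind. A minimal sketch, assuming ovn-sbctl is available inside the ovn_controller image and using the southbound DB address and certificate paths shown in its container config above, to list which chassis own which encaps:

    import subprocess

    SB_DB = "ssl:ovsdbserver-sb.openstack.svc:6642"
    SSL_OPTS = ["-p", "/etc/pki/tls/private/ovndb.key",
                "-c", "/etc/pki/tls/certs/ovndb.crt",
                "-C", "/etc/pki/tls/certs/ovndbca.crt"]

    def sbctl(*args):
        # Run ovn-sbctl inside the ovn_controller container, which already has
        # the southbound DB certificates mounted at the paths above.
        cmd = ["podman", "exec", "ovn_controller", "ovn-sbctl",
               "--db=" + SB_DB, *SSL_OPTS, *args]
        return subprocess.check_output(cmd, text=True)

    # Two chassis claiming the same geneve encap IP will both show up here.
    print(sbctl("--columns=name,encaps", "list", "Chassis"))
    print(sbctl("list", "Encap"))
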
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00021|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00022|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00023|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00024|features|INFO|OVS Feature: ct_flush, state: supported
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00025|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00026|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00027|main|INFO|OVS feature set changed, force recompute.
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00028|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00029|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00030|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00031|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00032|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00033|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00034|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00035|main|INFO|OVS feature set changed, force recompute.
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 30 04:10:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:01Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.473 105218 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.473 105218 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.473 105218 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.971 105218 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef7d389-b8da-4889-9508-eff3496cf69c]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.973 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:10:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:01.974 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
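The RowNotFound traceback above means the agent's DbAddCommand cannot find a Chassis_Private row named after this chassis UUID in the southbound DB, consistent with the encap conflict ovn_controller logged at 09:10:01: until ovn-controller manages to register the chassis, neither its Chassis nor Chassis_Private row exists for the metadata agent to tag. A minimal sketch, under the same assumptions as the previous snippet, to confirm whether those rows exist:

    import subprocess

    CHASSIS = "9803b804-d88a-4443-b777-6ecddbb75ed8"
    BASE = ["podman", "exec", "ovn_controller", "ovn-sbctl",
            "--db=ssl:ovsdbserver-sb.openstack.svc:6642",
            "-p", "/etc/pki/tls/private/ovndb.key",
            "-c", "/etc/pki/tls/certs/ovndb.crt",
            "-C", "/etc/pki/tls/certs/ovndbca.crt"]

    # An empty result from either query confirms what the traceback says:
    # there is no row for this chassis yet, so DbAddCommand has nothing to
    # update.  ovn-controller creates both rows when it registers successfully.
    for table in ("Chassis_Private", "Chassis"):
        out = subprocess.run(BASE + ["find", table, "name=" + CHASSIS],
                             capture_output=True, text=True)
        print(table, "->", out.stdout.strip() or "(no row)")
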
Jan 30 04:10:02 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:02.984 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:10:02 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:02.986 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
    [transaction and command tracebacks repeated verbatim; identical to the 09:10:01.974 traceback above]
Jan 30 04:10:04 np0005601978 systemd-logind[793]: New session 24 of user zuul.
Jan 30 04:10:04 np0005601978 systemd[1]: Started Session 24 of User zuul.
Jan 30 04:10:04 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:04.990 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:10:04 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:04.992 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
    [transaction and command tracebacks repeated verbatim; identical to the 09:10:01.974 traceback above]
Jan 30 04:10:05 np0005601978 python3.9[105376]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:10:06 np0005601978 python3.9[105532]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
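The command above is how the playbook checks whether an old nova_virtlogd container is still present (an empty result means it is gone). A minimal sketch of the same existence check using podman's container-exists subcommand, which exits 0 only when a container with that exact name or ID is found:

    import subprocess

    def container_exists(name):
        # 'podman container exists' exits 0 if a container with this exact
        # name or ID is present, non-zero otherwise.
        return subprocess.run(["podman", "container", "exists", name]).returncode == 0

    print("nova_virtlogd present:", container_exists("nova_virtlogd"))
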
Jan 30 04:10:08 np0005601978 python3.9[105697]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:10:08 np0005601978 systemd[1]: Reloading.
Jan 30 04:10:08 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:10:08 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:10:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:08.997 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:10:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:09.000 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
    [transaction and command tracebacks repeated verbatim; identical to the 09:10:01.974 traceback above]
Jan 30 04:10:09 np0005601978 python3.9[105882]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:10:09 np0005601978 network[105899]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:10:09 np0005601978 network[105900]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:10:09 np0005601978 network[105901]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:10:14 np0005601978 python3.9[106162]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:15 np0005601978 python3.9[106315]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:16 np0005601978 python3.9[106468]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:16 np0005601978 python3.9[106621]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:17.005 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:10:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:17.006 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
    [transaction and command tracebacks repeated verbatim; identical to the 09:10:01.974 traceback above]
Jan 30 04:10:17 np0005601978 python3.9[106774]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:18 np0005601978 python3.9[106927]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:18 np0005601978 python3.9[107080]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:10:19 np0005601978 python3.9[107233]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:20 np0005601978 python3.9[107385]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:21 np0005601978 python3.9[107537]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:21 np0005601978 python3.9[107689]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:22 np0005601978 python3.9[107841]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:22 np0005601978 python3.9[107993]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:23 np0005601978 python3.9[108145]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:24 np0005601978 python3.9[108297]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:24 np0005601978 python3.9[108449]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:25 np0005601978 python3.9[108601]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:25 np0005601978 python3.9[108753]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:26 np0005601978 podman[108877]: 2026-01-30 09:10:26.330350125 +0000 UTC m=+0.085820516 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:10:26 np0005601978 python3.9[108920]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:27 np0005601978 python3.9[109076]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:27 np0005601978 python3.9[109228]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:10:28 np0005601978 python3.9[109380]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
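Note: journald escapes embedded newlines as #012, so the shell fragment passed via _raw_params above reads, decoded for readability:

    if systemctl is-active certmonger.service; then
      systemctl disable --now certmonger.service
      test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
    fi

That is, the task stops and disables certmonger only if it is currently active, and masks it unless a unit file already exists under /etc/systemd/system.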
Jan 30 04:10:29 np0005601978 python3.9[109532]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:10:29 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:29Z|00038|chassis|WARN|Dropped 12 log messages in last 29 seconds (most recently, 21 seconds ago) due to excessive rate
Jan 30 04:10:29 np0005601978 ovn_controller[95419]: 2026-01-30T09:10:29Z|00039|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:10:29 np0005601978 podman[109656]: 2026-01-30 09:10:29.88898304 +0000 UTC m=+0.099819673 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:10:30 np0005601978 python3.9[109703]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:10:30 np0005601978 systemd[1]: Reloading.
Jan 30 04:10:30 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:10:30 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:10:31 np0005601978 python3.9[109898]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:31 np0005601978 python3.9[110051]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:32 np0005601978 python3.9[110204]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:32 np0005601978 python3.9[110357]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.008 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:10:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:33.009 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
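The RowNotFound traceback above shows the metadata agent trying to add its neutron:ovn-metadata-id to a Chassis_Private row named 9803b804-d88a-4443-b777-6ecddbb75ed8 that is not present in its view of the southbound database; the ovn_controller warnings at 04:10:29 indicate that chassis d14b9ab5-bf6e-4142-ad45-b863645e483d still owns the geneve encap IP 172.19.0.101, so the new chassis record cannot be registered. One way to confirm which chassis rows the southbound database actually holds (a hedged example, assuming ovn-sbctl is available inside the ovn_controller container, and reusing the database URL and certificate paths shown in that container's volume mounts) is:

    podman exec ovn_controller ovn-sbctl \
        --db=ssl:ovsdbserver-sb.openstack.svc:6642 \
        -p /etc/pki/tls/private/ovndb.key \
        -c /etc/pki/tls/certs/ovndb.crt \
        -C /etc/pki/tls/certs/ovndbca.crt \
        list Chassis_Private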
Jan 30 04:10:33 np0005601978 python3.9[110510]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:33 np0005601978 python3.9[110663]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:34 np0005601978 python3.9[110816]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:10:35 np0005601978 python3.9[110969]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 30 04:10:36 np0005601978 python3.9[111122]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:10:37 np0005601978 python3.9[111280]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 30 04:10:38 np0005601978 python3.9[111440]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:10:39 np0005601978 python3.9[111524]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:10:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:57.298 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:10:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:57.299 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:10:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:10:57.299 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:10:57 np0005601978 podman[111713]: 2026-01-30 09:10:57.44165336 +0000 UTC m=+0.077057102 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:11:00 np0005601978 ovn_controller[95419]: 2026-01-30T09:11:00Z|00040|chassis|WARN|Dropped 2 log messages in last 30 seconds (most recently, 30 seconds ago) due to excessive rate
Jan 30 04:11:00 np0005601978 ovn_controller[95419]: 2026-01-30T09:11:00Z|00041|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:11:00 np0005601978 podman[111734]: 2026-01-30 09:11:00.479668076 +0000 UTC m=+0.135065045 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.012 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:11:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:05.013 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
Jan 30 04:11:09 np0005601978 kernel: SELinux:  Converting 2765 SID table entries...
Jan 30 04:11:09 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:11:09 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:11:09 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:11:09 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:11:09 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:11:09 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:11:09 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:11:20 np0005601978 kernel: SELinux:  Converting 2765 SID table entries...
Jan 30 04:11:20 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:11:20 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:11:20 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:11:20 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:11:20 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:11:20 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:11:20 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:11:28 np0005601978 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 30 04:11:28 np0005601978 podman[111775]: 2026-01-30 09:11:28.43412634 +0000 UTC m=+0.067392362 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 30 04:11:31 np0005601978 ovn_controller[95419]: 2026-01-30T09:11:31Z|00042|chassis|WARN|Dropped 5 log messages in last 31 seconds (most recently, 30 seconds ago) due to excessive rate
Jan 30 04:11:31 np0005601978 ovn_controller[95419]: 2026-01-30T09:11:31Z|00043|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:11:31 np0005601978 podman[112056]: 2026-01-30 09:11:31.443657223 +0000 UTC m=+0.102157853 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:11:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:57.302 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:11:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:57.303 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:11:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:11:57.303 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:11:59 np0005601978 podman[128687]: 2026-01-30 09:11:59.451309282 +0000 UTC m=+0.095030331 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:12:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:00.398 104657 ERROR ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: no response to inactivity probe after 60 seconds, disconnecting#033[00m
Jan 30 04:12:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:00.399 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped#033[00m
Jan 30 04:12:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:00.503 105213 ERROR ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: no response to inactivity probe after 60 seconds, disconnecting#033[00m
Jan 30 04:12:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:00.503 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped#033[00m
Jan 30 04:12:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:12:01Z|00044|chassis|WARN|Dropped 1 log messages in last 30 seconds (most recently, 30 seconds ago) due to excessive rate
Jan 30 04:12:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:12:01Z|00045|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:12:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:12:01Z|00046|reconnect|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: no response to inactivity probe after 60 seconds, disconnecting
Jan 30 04:12:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:12:01Z|00047|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped
Jan 30 04:12:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:12:01Z|00048|main|INFO|OVNSB commit failed, force recompute next time.
Jan 30 04:12:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:01.411 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:12:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:01.421 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:12:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:01.511 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:12:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:01.520 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
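Both the metadata agent and ovn_controller dropped their southbound connections after the 60-second inactivity probe went unanswered, then reconnected within about a second. If these probe timeouts keep recurring, the interval ovn-controller uses toward the southbound database can be raised through its Open_vSwitch external_ids configuration (a hedged sketch, value in milliseconds, assuming ovs-vsctl is run wherever the local ovsdb-server socket is reachable):

    ovs-vsctl set open . external_ids:ovn-remote-probe-interval=180000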
Jan 30 04:12:02 np0005601978 podman[128709]: 2026-01-30 09:12:02.446921698 +0000 UTC m=+0.100499052 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=1, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:12:02 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:12:02 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Failed with result 'exit-code'.
Jan 30 04:12:07 np0005601978 kernel: SELinux:  Converting 2766 SID table entries...
Jan 30 04:12:07 np0005601978 kernel: SELinux:  policy capability network_peer_controls=1
Jan 30 04:12:07 np0005601978 kernel: SELinux:  policy capability open_perms=1
Jan 30 04:12:07 np0005601978 kernel: SELinux:  policy capability extended_socket_class=1
Jan 30 04:12:07 np0005601978 kernel: SELinux:  policy capability always_check_network=0
Jan 30 04:12:07 np0005601978 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 30 04:12:07 np0005601978 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 30 04:12:07 np0005601978 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 30 04:12:08 np0005601978 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Jan 30 04:12:08 np0005601978 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 30 04:12:08 np0005601978 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.015 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:12:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:09.017 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:12:09 np0005601978 ovn_controller[95419]: 2026-01-30T09:12:09Z|00049|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:12:16 np0005601978 systemd[1]: Stopping OpenSSH server daemon...
Jan 30 04:12:16 np0005601978 systemd[1]: sshd.service: Deactivated successfully.
Jan 30 04:12:16 np0005601978 systemd[1]: Stopped OpenSSH server daemon.
Jan 30 04:12:16 np0005601978 systemd[1]: sshd.service: Consumed 1.201s CPU time, read 32.0K from disk, written 0B to disk.
Jan 30 04:12:16 np0005601978 systemd[1]: Stopped target sshd-keygen.target.
Jan 30 04:12:16 np0005601978 systemd[1]: Stopping sshd-keygen.target...
Jan 30 04:12:16 np0005601978 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 04:12:16 np0005601978 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 04:12:16 np0005601978 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 30 04:12:16 np0005601978 systemd[1]: Reached target sshd-keygen.target.
Jan 30 04:12:16 np0005601978 systemd[1]: Starting OpenSSH server daemon...
Jan 30 04:12:16 np0005601978 systemd[1]: Started OpenSSH server daemon.
Jan 30 04:12:18 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:12:18 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:12:18 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:18 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:18 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:18 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:12:24 np0005601978 python3.9[136602]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:24 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:24 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:24 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:25 np0005601978 python3.9[138457]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:26 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:26 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:26 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:26 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:12:26 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:12:26 np0005601978 systemd[1]: man-db-cache-update.service: Consumed 7.749s CPU time.
Jan 30 04:12:26 np0005601978 systemd[1]: run-rd238142d4e364df29fd91fecb79b95b0.service: Deactivated successfully.
Jan 30 04:12:26 np0005601978 python3.9[138664]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:28 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:28 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:28 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:28 np0005601978 python3.9[138853]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:28 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:29 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:29 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:29 np0005601978 podman[139015]: 2026-01-30 09:12:29.984273122 +0000 UTC m=+0.070365915 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 30 04:12:30 np0005601978 python3.9[139055]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:30 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:30 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:30 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:31 np0005601978 python3.9[139252]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:31 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:31 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:31 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:32 np0005601978 python3.9[139441]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:32 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:32 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:32 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:32 np0005601978 ovn_controller[95419]: 2026-01-30T09:12:32Z|00050|chassis|WARN|Dropped 4 log messages in last 32 seconds (most recently, 24 seconds ago) due to excessive rate
Jan 30 04:12:32 np0005601978 ovn_controller[95419]: 2026-01-30T09:12:32Z|00051|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:12:32 np0005601978 podman[139480]: 2026-01-30 09:12:32.77089063 +0000 UTC m=+0.096241893 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 30 04:12:33 np0005601978 python3.9[139655]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:34 np0005601978 python3.9[139810]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:34 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:34 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:34 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:35 np0005601978 python3.9[140002]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 30 04:12:35 np0005601978 systemd[1]: Reloading.
Jan 30 04:12:35 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:12:35 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:12:35 np0005601978 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 30 04:12:35 np0005601978 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 30 04:12:36 np0005601978 python3.9[140195]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:37 np0005601978 python3.9[140350]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:38 np0005601978 python3.9[140505]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:39 np0005601978 python3.9[140660]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:39 np0005601978 python3.9[140815]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:40 np0005601978 python3.9[140970]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:41 np0005601978 python3.9[141125]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:42 np0005601978 python3.9[141280]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:43 np0005601978 python3.9[141435]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:43 np0005601978 python3.9[141590]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:44 np0005601978 python3.9[141745]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:45 np0005601978 python3.9[141900]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:46 np0005601978 python3.9[142055]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:47 np0005601978 python3.9[142210]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 30 04:12:49 np0005601978 python3.9[142365]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:50 np0005601978 python3.9[142517]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:51 np0005601978 python3.9[142669]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:51 np0005601978 python3.9[142821]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:52 np0005601978 python3.9[142973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:52 np0005601978 python3.9[143125]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:12:53 np0005601978 python3.9[143275]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:12:54 np0005601978 python3.9[143427]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:12:55 np0005601978 python3.9[143552]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764374.0951726-1642-130276668597708/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:12:56 np0005601978 python3.9[143704]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:12:56 np0005601978 python3.9[143829]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764375.8016806-1642-73233445676681/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.307 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.309 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.309 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:12:57 np0005601978 python3.9[143981]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.523 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.523 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.525 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.525 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.525 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:12:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:57.525 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:12:57 np0005601978 python3.9[144106]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764376.9519577-1642-80898192275117/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:12:58 np0005601978 python3.9[144258]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:12:59 np0005601978 python3.9[144383]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764378.1791165-1642-62385821586232/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:12:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:59.536 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:12:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:59.536 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:12:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:59.544 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:12:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:12:59.544 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:12:59 np0005601978 python3.9[144535]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:00 np0005601978 podman[144632]: 2026-01-30 09:13:00.354409253 +0000 UTC m=+0.088464086 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:13:00 np0005601978 python3.9[144676]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764379.4477491-1642-202937902645189/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:01 np0005601978 python3.9[144831]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:13:01Z|00052|chassis|WARN|Dropped 2 log messages in last 28 seconds (most recently, 28 seconds ago) due to excessive rate
Jan 30 04:13:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:13:01Z|00053|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:13:01 np0005601978 python3.9[144956]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764380.599528-1642-136033087847651/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:02 np0005601978 python3.9[145108]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:02 np0005601978 python3.9[145231]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764381.6040168-1642-260584027566172/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:02 np0005601978 podman[145355]: 2026-01-30 09:13:02.967374341 +0000 UTC m=+0.079500043 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:13:03 np0005601978 python3.9[145398]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:03 np0005601978 python3.9[145532]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769764382.6725209-1642-224781027019949/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:04 np0005601978 python3.9[145684]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 30 04:13:05 np0005601978 python3.9[145837]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:06 np0005601978 python3.9[145989]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:06 np0005601978 python3.9[146141]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:07 np0005601978 python3.9[146293]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:08 np0005601978 python3.9[146445]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:08 np0005601978 python3.9[146597]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:09 np0005601978 python3.9[146749]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:09 np0005601978 python3.9[146901]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:10 np0005601978 python3.9[147053]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:10 np0005601978 python3.9[147205]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:11 np0005601978 python3.9[147357]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:11 np0005601978 python3.9[147509]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:12 np0005601978 python3.9[147661]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:13 np0005601978 python3.9[147813]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:14 np0005601978 python3.9[147965]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:15 np0005601978 python3.9[148088]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764393.8917587-2305-61073567465526/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:16 np0005601978 python3.9[148240]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:16 np0005601978 python3.9[148363]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764395.2833219-2305-174173919393510/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:17 np0005601978 python3.9[148515]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:17 np0005601978 python3.9[148638]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764396.7862399-2305-154415775378505/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:18 np0005601978 python3.9[148790]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:18 np0005601978 python3.9[148913]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764397.8418872-2305-189779969793033/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:19 np0005601978 python3.9[149065]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:20 np0005601978 python3.9[149188]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764398.9173152-2305-123663905048098/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:20 np0005601978 python3.9[149340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:21 np0005601978 python3.9[149463]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764400.2257082-2305-21776218401572/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:21 np0005601978 python3.9[149615]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:22 np0005601978 python3.9[149738]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764401.4623418-2305-52024877247150/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:23 np0005601978 python3.9[149890]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:23 np0005601978 python3.9[150013]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764402.6943548-2305-276635693035577/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:24 np0005601978 python3.9[150165]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:25 np0005601978 python3.9[150288]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764403.8563123-2305-8202674026499/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:25 np0005601978 python3.9[150440]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:26 np0005601978 python3.9[150563]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764405.1763434-2305-213743592718728/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:26 np0005601978 python3.9[150715]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:27 np0005601978 python3.9[150838]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764406.400241-2305-3839156721955/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:27 np0005601978 python3.9[150990]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:28 np0005601978 python3.9[151113]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764407.493551-2305-15447025826474/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:29 np0005601978 python3.9[151265]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:29 np0005601978 python3.9[151388]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764408.7179031-2305-227124700451431/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:30 np0005601978 python3.9[151540]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:30 np0005601978 podman[151635]: 2026-01-30 09:13:30.886612826 +0000 UTC m=+0.073046220 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:13:31 np0005601978 python3.9[151682]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764409.9696124-2305-107859956659991/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
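The override.conf contents themselves are not logged (only the SHA1 checksum), but the deployed drop-ins can be inspected directly with standard systemd tooling; a minimal check, assuming the units are the stock libvirt socket units:

    # Show the virtsecretd socket units together with the override.conf drop-ins written above
    systemctl cat virtsecretd-ro.socket virtsecretd-admin.socket
    # List every unit override present on the host
    systemd-delta --type=extended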
Jan 30 04:13:31 np0005601978 python3.9[151833]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
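De-escaped (#012 is the journal's newline encoding), the check above is a short pipeline that flags anything under /run/libvirt still labelled with a container_* SELinux type:

    set -o pipefail
    ls -lRZ /run/libvirt | grep -E ':container_\S+_t'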
Jan 30 04:13:32 np0005601978 python3.9[151988]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
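The seboolean task is equivalent to setting the boolean persistently by hand; a minimal sketch using the standard SELinux tools:

    # Persistently enable the vTPM boolean managed by the task above
    setsebool -P os_enable_vtpm on
    # Confirm the new value
    getsebool os_enable_vtpm     # expected: os_enable_vtpm --> on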
Jan 30 04:13:33 np0005601978 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 30 04:13:33 np0005601978 ovn_controller[95419]: 2026-01-30T09:13:33Z|00054|chassis|WARN|Dropped 5 log messages in last 32 seconds (most recently, 24 seconds ago) due to excessive rate
Jan 30 04:13:33 np0005601978 ovn_controller[95419]: 2026-01-30T09:13:33Z|00055|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
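The repeated chassis warning means the encap IP 172.19.0.101 is already registered on another chassis record; one way to inspect this (not part of the run above, and assuming ovn-sbctl can reach the southbound DB, with connection/TLS options omitted):

    # See which chassis record currently owns the geneve encap for this IP
    ovn-sbctl list Chassis
    ovn-sbctl find Encap ip=172.19.0.101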
Jan 30 04:13:33 np0005601978 podman[151990]: 2026-01-30 09:13:33.47533824 +0000 UTC m=+0.118359079 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:13:34 np0005601978 python3.9[152171]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:35 np0005601978 python3.9[152323]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:36 np0005601978 python3.9[152475]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:36 np0005601978 python3.9[152627]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:37 np0005601978 python3.9[152779]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:38 np0005601978 python3.9[152931]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:38 np0005601978 python3.9[153083]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:39 np0005601978 python3.9[153235]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:40 np0005601978 python3.9[153387]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:40 np0005601978 python3.9[153539]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
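The copy tasks above lay out the libvirt and QEMU TLS material; a quick verification of the resulting paths and modes, using the destinations taken from those tasks:

    # libvirt daemon/client certificates and keys (root-owned, per the tasks above)
    ls -lZ /etc/pki/CA/cacert.pem \
           /etc/pki/libvirt/servercert.pem /etc/pki/libvirt/clientcert.pem \
           /etc/pki/libvirt/private/serverkey.pem /etc/pki/libvirt/private/clientkey.pem
    # QEMU native TLS material (group qemu, mode 0640, per the tasks above)
    ls -lZ /etc/pki/qemu/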
Jan 30 04:13:41 np0005601978 python3.9[153691]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:41 np0005601978 systemd[1]: Reloading.
Jan 30 04:13:42 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:42 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:42 np0005601978 systemd[1]: Starting libvirt logging daemon socket...
Jan 30 04:13:42 np0005601978 systemd[1]: Listening on libvirt logging daemon socket.
Jan 30 04:13:42 np0005601978 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 30 04:13:42 np0005601978 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 30 04:13:42 np0005601978 systemd[1]: Starting libvirt logging daemon...
Jan 30 04:13:42 np0005601978 systemd[1]: Started libvirt logging daemon.
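After the restart, the socket-activated virtlogd units can be sanity-checked with systemctl; a minimal sketch:

    # The logging daemon and its activation sockets should all report active
    systemctl is-active virtlogd.service virtlogd.socket virtlogd-admin.socket
    systemctl list-sockets | grep virtlog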
Jan 30 04:13:43 np0005601978 python3.9[153884]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:43 np0005601978 systemd[1]: Reloading.
Jan 30 04:13:43 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:43 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:43 np0005601978 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 30 04:13:43 np0005601978 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 30 04:13:43 np0005601978 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 30 04:13:43 np0005601978 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 30 04:13:43 np0005601978 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 30 04:13:43 np0005601978 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 30 04:13:43 np0005601978 systemd[1]: Starting libvirt nodedev daemon...
Jan 30 04:13:43 np0005601978 systemd[1]: Started libvirt nodedev daemon.
Jan 30 04:13:44 np0005601978 python3.9[154099]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:44 np0005601978 systemd[1]: Reloading.
Jan 30 04:13:44 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:44 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:44 np0005601978 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 30 04:13:44 np0005601978 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 30 04:13:44 np0005601978 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 30 04:13:44 np0005601978 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 30 04:13:44 np0005601978 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 30 04:13:44 np0005601978 systemd[1]: Starting libvirt proxy daemon...
Jan 30 04:13:44 np0005601978 systemd[1]: Started libvirt proxy daemon.
Jan 30 04:13:44 np0005601978 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 30 04:13:44 np0005601978 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 30 04:13:44 np0005601978 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 30 04:13:45 np0005601978 python3.9[154318]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:45 np0005601978 systemd[1]: Reloading.
Jan 30 04:13:45 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:45 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:45 np0005601978 systemd[1]: Listening on libvirt locking daemon socket.
Jan 30 04:13:45 np0005601978 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 30 04:13:45 np0005601978 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 30 04:13:45 np0005601978 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 30 04:13:45 np0005601978 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 30 04:13:45 np0005601978 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 30 04:13:45 np0005601978 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 30 04:13:45 np0005601978 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 30 04:13:45 np0005601978 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 30 04:13:45 np0005601978 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 30 04:13:45 np0005601978 systemd[1]: Starting libvirt QEMU daemon...
Jan 30 04:13:45 np0005601978 systemd[1]: Started libvirt QEMU daemon.
Jan 30 04:13:45 np0005601978 setroubleshoot[154136]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4e99dad9-423f-4b16-8234-8e2b9670335b
Jan 30 04:13:45 np0005601978 setroubleshoot[154136]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 30 04:13:45 np0005601978 setroubleshoot[154136]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4e99dad9-423f-4b16-8234-8e2b9670335b
Jan 30 04:13:45 np0005601978 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:13:45 np0005601978 setroubleshoot[154136]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
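Unescaped, the workflow setroubleshoot suggests is the usual audit2allow loop; reproduced here for readability (the my-virtlogd module name comes from the suggestion itself):

    # Turn on full auditing so the offending path is captured, then retrigger the AVC
    auditctl -w /etc/shadow -p w
    ausearch -m avc -ts recent
    # If the access is expected, build and install a local policy module for virtlogd
    ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
    semodule -X 300 -i my-virtlogd.pp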
Jan 30 04:13:46 np0005601978 python3.9[154536]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:13:46 np0005601978 systemd[1]: Reloading.
Jan 30 04:13:46 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:13:46 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:13:46 np0005601978 systemd[1]: Starting libvirt secret daemon socket...
Jan 30 04:13:46 np0005601978 systemd[1]: Listening on libvirt secret daemon socket.
Jan 30 04:13:46 np0005601978 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 30 04:13:46 np0005601978 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 30 04:13:46 np0005601978 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 30 04:13:46 np0005601978 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 30 04:13:46 np0005601978 systemd[1]: Starting libvirt secret daemon...
Jan 30 04:13:46 np0005601978 systemd[1]: Started libvirt secret daemon.
Jan 30 04:13:48 np0005601978 python3.9[154747]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:48 np0005601978 python3.9[154899]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:13:49 np0005601978 python3.9[155051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:50 np0005601978 python3.9[155174]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764429.497559-3340-116016159860947/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:51 np0005601978 python3.9[155326]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:51 np0005601978 python3.9[155478]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:52 np0005601978 python3.9[155556]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:52 np0005601978 python3.9[155708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:53 np0005601978 python3.9[155786]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.5qysejbo recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:54 np0005601978 python3.9[155938]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:54 np0005601978 python3.9[156016]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:55 np0005601978 python3.9[156168]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:13:55 np0005601978 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 30 04:13:55 np0005601978 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 30 04:13:56 np0005601978 python3[156321]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 30 04:13:56 np0005601978 python3.9[156473]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:57 np0005601978 python3.9[156551]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:13:57.311 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:13:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:13:57.312 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:13:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:13:57.313 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:13:57 np0005601978 python3.9[156703]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:58 np0005601978 python3.9[156828]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764437.470454-3607-3528257308956/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:13:59 np0005601978 python3.9[156980]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:13:59 np0005601978 python3.9[157058]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:00 np0005601978 python3.9[157210]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:00 np0005601978 python3.9[157288]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:14:01Z|00056|chassis|WARN|Dropped 1 log messages in last 28 seconds (most recently, 28 seconds ago) due to excessive rate
Jan 30 04:14:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:14:01Z|00057|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:14:01 np0005601978 podman[157365]: 2026-01-30 09:14:01.420707881 +0000 UTC m=+0.078319406 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 30 04:14:01 np0005601978 python3.9[157459]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:02 np0005601978 python3.9[157584]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764441.178724-3724-9197808089400/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:02 np0005601978 python3.9[157736]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:03 np0005601978 python3.9[157888]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:04 np0005601978 podman[158015]: 2026-01-30 09:14:04.29772173 +0000 UTC m=+0.108682984 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 30 04:14:04 np0005601978 python3.9[158060]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
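Reconstructed from the blockinfile parameters above (markers, include lines, and the nft -c -f %s validate command), the managed block in /etc/sysconfig/nftables.conf should read:

    # BEGIN ANSIBLE MANAGED BLOCK
    include "/etc/nftables/iptables.nft"
    include "/etc/nftables/edpm-chains.nft"
    include "/etc/nftables/edpm-rules.nft"
    include "/etc/nftables/edpm-jumps.nft"
    # END ANSIBLE MANAGED BLOCK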
Jan 30 04:14:05 np0005601978 python3.9[158220]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:05 np0005601978 python3.9[158373]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:14:06 np0005601978 python3.9[158527]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
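Taken together, the commands at 04:14:03 through 04:14:06 implement a validate-then-apply cycle for the EDPM nftables files; de-escaped, the sequence is:

    # 1. Dry-run the complete ruleset (chains, flushes, rules, jump updates, jumps)
    cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft \
        /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft \
        /etc/nftables/edpm-jumps.nft | nft -c -f -
    # 2. Create the chains, then flush and reload the rules and jump updates
    nft -f /etc/nftables/edpm-chains.nft
    cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft \
        /etc/nftables/edpm-update-jumps.nft | nft -f -
    # 3. Inspect the live ruleset (the task at 04:13:55 used the JSON form, nft -j list ruleset)
    nft list ruleset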
Jan 30 04:14:07 np0005601978 python3.9[158682]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:08 np0005601978 python3.9[158834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:08 np0005601978 python3.9[158957]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764447.6660862-3940-120644672771748/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:09 np0005601978 python3.9[159109]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:09 np0005601978 python3.9[159232]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764448.919021-3985-156260207801655/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:10 np0005601978 python3.9[159384]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:11 np0005601978 python3.9[159507]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764450.1716068-4030-160201122026962/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:11 np0005601978 python3.9[159659]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:14:11 np0005601978 systemd[1]: Reloading.
Jan 30 04:14:12 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:12 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:12 np0005601978 systemd[1]: Reached target edpm_libvirt.target.
Jan 30 04:14:12 np0005601978 python3.9[159851]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 30 04:14:13 np0005601978 systemd[1]: Reloading.
Jan 30 04:14:13 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:13 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:13 np0005601978 systemd[1]: Reloading.
Jan 30 04:14:13 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:13 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:13 np0005601978 systemd[1]: session-24.scope: Deactivated successfully.
Jan 30 04:14:13 np0005601978 systemd[1]: session-24.scope: Consumed 3min 9.348s CPU time.
Jan 30 04:14:13 np0005601978 systemd-logind[793]: Session 24 logged out. Waiting for processes to exit.
Jan 30 04:14:13 np0005601978 systemd-logind[793]: Removed session 24.
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.022 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:14:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:17.024 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
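The RowNotFound traceback means the metadata agent tried to tag a Chassis_Private record that is not (or not yet) present in the southbound DB; a quick way to check, not part of the run above and assuming ovn-sbctl has connectivity to the SB database (TLS/--db options omitted):

    # Does the chassis-private record named in the traceback exist?
    ovn-sbctl list Chassis_Private 9803b804-d88a-4443-b777-6ecddbb75ed8
    # Compare against the chassis that already owns the geneve encap per the warnings above
    ovn-sbctl list Chassis d14b9ab5-bf6e-4142-ad45-b863645e483d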
Jan 30 04:14:20 np0005601978 systemd-logind[793]: New session 25 of user zuul.
Jan 30 04:14:20 np0005601978 systemd[1]: Started Session 25 of User zuul.
Jan 30 04:14:21 np0005601978 python3.9[160099]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:14:22 np0005601978 python3.9[160253]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:14:22 np0005601978 network[160270]: You are using the 'network' service provided by 'network-scripts', which is now deprecated.
Jan 30 04:14:22 np0005601978 network[160271]: 'network-scripts' will be removed from the distribution in the near future.
Jan 30 04:14:22 np0005601978 network[160272]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:14:28 np0005601978 python3.9[160543]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 30 04:14:29 np0005601978 python3.9[160627]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:14:32 np0005601978 podman[160631]: 2026-01-30 09:14:32.397149164 +0000 UTC m=+0.056874406 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:14:34 np0005601978 ovn_controller[95419]: 2026-01-30T09:14:34Z|00058|chassis|WARN|Dropped 4 log messages in last 30 seconds (most recently, 25 seconds ago) due to excessive rate
Jan 30 04:14:34 np0005601978 ovn_controller[95419]: 2026-01-30T09:14:34Z|00059|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:14:34 np0005601978 podman[160674]: 2026-01-30 09:14:34.421775869 +0000 UTC m=+0.081347393 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 30 04:14:34 np0005601978 python3.9[160826]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:14:35 np0005601978 python3.9[160978]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:36 np0005601978 python3.9[161131]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:14:37 np0005601978 python3.9[161283]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:38 np0005601978 python3.9[161436]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:38 np0005601978 python3.9[161559]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764477.5751479-241-195403940498410/.source.iscsi _original_basename=.uec1yqbb follow=False checksum=9472cd687e1fbda18c191be605bfb98bca78656a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:39 np0005601978 python3.9[161711]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:40 np0005601978 python3.9[161863]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:41 np0005601978 python3.9[162015]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:14:41 np0005601978 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 30 04:14:42 np0005601978 python3.9[162171]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:14:42 np0005601978 systemd[1]: Reloading.
Jan 30 04:14:42 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:42 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:42 np0005601978 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 30 04:14:42 np0005601978 systemd[1]: Starting Open-iSCSI...
Jan 30 04:14:42 np0005601978 kernel: Loading iSCSI transport class v2.0-870.
Jan 30 04:14:42 np0005601978 systemd[1]: Started Open-iSCSI.
Jan 30 04:14:42 np0005601978 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 30 04:14:42 np0005601978 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 30 04:14:43 np0005601978 python3.9[162370]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:14:43 np0005601978 network[162387]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:14:43 np0005601978 network[162388]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:14:43 np0005601978 network[162389]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:14:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:45.171 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:14:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:45.172 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:14:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:45.173 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:14:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:45.174 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:14:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:46.190 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:14:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:46.190 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:14:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:47.192 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:14:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:47.192 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:14:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:47.192 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:14:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:47.192 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:14:47 np0005601978 python3.9[162660]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:14:49 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:49.203 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:14:49 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:49.203 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:14:50 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:14:50 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:14:50 np0005601978 systemd[1]: Reloading.
Jan 30 04:14:50 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:14:50 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:14:50 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:14:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:51.206 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:14:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:51.206 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:14:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:51.207 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:14:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:51.207 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:14:51 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:14:51 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:14:51 np0005601978 systemd[1]: run-r8209fa9c66dd4095a7b7963cb7557568.service: Deactivated successfully.
Jan 30 04:14:52 np0005601978 python3.9[162976]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 30 04:14:53 np0005601978 python3.9[163128]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 30 04:14:54 np0005601978 python3.9[163284]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:54 np0005601978 python3.9[163407]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764493.725203-505-269708724735803/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:55 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:55.220 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:14:55 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:55.220 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:14:55 np0005601978 python3.9[163559]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:14:56 np0005601978 python3.9[163711]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:14:56 np0005601978 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 30 04:14:56 np0005601978 systemd[1]: Stopped Load Kernel Modules.
Jan 30 04:14:56 np0005601978 systemd[1]: Stopping Load Kernel Modules...
Jan 30 04:14:56 np0005601978 systemd[1]: Starting Load Kernel Modules...
Jan 30 04:14:56 np0005601978 systemd[1]: Finished Load Kernel Modules.
Jan 30 04:14:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:57.313 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:14:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:57.314 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:14:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:57.314 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:14:57 np0005601978 python3.9[163867]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:14:58 np0005601978 python3.9[164020]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:14:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:59.222 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:14:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:59.222 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:14:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:59.225 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:14:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:14:59.225 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:14:59 np0005601978 python3.9[164172]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:14:59 np0005601978 python3.9[164295]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764498.8494458-658-126917002243799/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:00 np0005601978 python3.9[164447]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:15:01Z|00060|chassis|WARN|Dropped 1 log messages in last 27 seconds (most recently, 27 seconds ago) due to excessive rate
Jan 30 04:15:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:15:01Z|00061|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:15:01 np0005601978 python3.9[164600]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:02 np0005601978 python3.9[164752]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:02 np0005601978 podman[164876]: 2026-01-30 09:15:02.661230601 +0000 UTC m=+0.063969406 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 30 04:15:02 np0005601978 python3.9[164919]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:03 np0005601978 python3.9[165074]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:04 np0005601978 python3.9[165226]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:04 np0005601978 podman[165350]: 2026-01-30 09:15:04.644301642 +0000 UTC m=+0.089287776 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 30 04:15:04 np0005601978 python3.9[165396]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:05 np0005601978 python3.9[165554]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:05 np0005601978 python3.9[165706]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:15:06 np0005601978 python3.9[165860]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:07 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:15:07.242 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:15:07 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:15:07.245 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:15:07 np0005601978 python3.9[166013]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:07 np0005601978 systemd[1]: Listening on multipathd control socket.
Jan 30 04:15:08 np0005601978 python3.9[166169]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:08 np0005601978 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 30 04:15:08 np0005601978 udevadm[166174]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 30 04:15:08 np0005601978 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 30 04:15:08 np0005601978 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 30 04:15:08 np0005601978 multipathd[166178]: --------start up--------
Jan 30 04:15:08 np0005601978 multipathd[166178]: read /etc/multipath.conf
Jan 30 04:15:08 np0005601978 multipathd[166178]: path checkers start up
Jan 30 04:15:08 np0005601978 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 30 04:15:10 np0005601978 python3.9[166337]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 30 04:15:10 np0005601978 python3.9[166489]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 30 04:15:10 np0005601978 kernel: Key type psk registered
Jan 30 04:15:11 np0005601978 python3.9[166650]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:15:12 np0005601978 python3.9[166773]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764511.1739259-1048-51774182258226/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:12 np0005601978 python3.9[166925]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:13 np0005601978 python3.9[167077]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:15:13 np0005601978 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 30 04:15:13 np0005601978 systemd[1]: Stopped Load Kernel Modules.
Jan 30 04:15:13 np0005601978 systemd[1]: Stopping Load Kernel Modules...
Jan 30 04:15:13 np0005601978 systemd[1]: Starting Load Kernel Modules...
Jan 30 04:15:13 np0005601978 systemd[1]: Finished Load Kernel Modules.
Jan 30 04:15:14 np0005601978 python3.9[167233]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 30 04:15:16 np0005601978 systemd[1]: Reloading.
Jan 30 04:15:17 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:17 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:17 np0005601978 systemd[1]: Reloading.
Jan 30 04:15:17 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:17 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:17 np0005601978 systemd-logind[793]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 30 04:15:17 np0005601978 systemd-logind[793]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 30 04:15:17 np0005601978 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 30 04:15:17 np0005601978 systemd[1]: Starting man-db-cache-update.service...
Jan 30 04:15:17 np0005601978 systemd[1]: Reloading.
Jan 30 04:15:17 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:17 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:18 np0005601978 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 30 04:15:19 np0005601978 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 30 04:15:19 np0005601978 systemd[1]: Finished man-db-cache-update.service.
Jan 30 04:15:19 np0005601978 systemd[1]: man-db-cache-update.service: Consumed 1.344s CPU time.
Jan 30 04:15:19 np0005601978 systemd[1]: run-r65238be1788c481f8020cdfb7f453030.service: Deactivated successfully.
Jan 30 04:15:20 np0005601978 python3.9[168698]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:15:20 np0005601978 systemd[1]: Stopping Open-iSCSI...
Jan 30 04:15:20 np0005601978 iscsid[162211]: iscsid shutting down.
Jan 30 04:15:20 np0005601978 systemd[1]: iscsid.service: Deactivated successfully.
Jan 30 04:15:20 np0005601978 systemd[1]: Stopped Open-iSCSI.
Jan 30 04:15:20 np0005601978 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 30 04:15:20 np0005601978 systemd[1]: Starting Open-iSCSI...
Jan 30 04:15:20 np0005601978 systemd[1]: Started Open-iSCSI.
Jan 30 04:15:21 np0005601978 python3.9[168855]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:15:21 np0005601978 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 30 04:15:21 np0005601978 multipathd[166178]: exit (signal)
Jan 30 04:15:21 np0005601978 multipathd[166178]: --------shut down-------
Jan 30 04:15:21 np0005601978 systemd[1]: multipathd.service: Deactivated successfully.
Jan 30 04:15:21 np0005601978 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 30 04:15:21 np0005601978 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 30 04:15:21 np0005601978 multipathd[168861]: --------start up--------
Jan 30 04:15:21 np0005601978 multipathd[168861]: read /etc/multipath.conf
Jan 30 04:15:21 np0005601978 multipathd[168861]: path checkers start up
Jan 30 04:15:21 np0005601978 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 30 04:15:22 np0005601978 python3.9[169018]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:15:23 np0005601978 python3.9[169174]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:24 np0005601978 python3.9[169326]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:15:24 np0005601978 systemd[1]: Reloading.
Jan 30 04:15:24 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:24 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:25 np0005601978 python3.9[169511]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:15:25 np0005601978 network[169528]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:15:25 np0005601978 network[169529]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:15:25 np0005601978 network[169530]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:15:29 np0005601978 python3.9[169802]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:30 np0005601978 python3.9[169955]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:31 np0005601978 python3.9[170108]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:31 np0005601978 python3.9[170261]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:32 np0005601978 python3.9[170414]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:32 np0005601978 podman[170416]: 2026-01-30 09:15:32.821982379 +0000 UTC m=+0.067850910 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:15:33 np0005601978 python3.9[170585]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:34 np0005601978 python3.9[170738]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:34 np0005601978 python3.9[170891]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:15:35 np0005601978 ovn_controller[95419]: 2026-01-30T09:15:35Z|00062|chassis|WARN|Dropped 4 log messages in last 30 seconds (most recently, 26 seconds ago) due to excessive rate
Jan 30 04:15:35 np0005601978 ovn_controller[95419]: 2026-01-30T09:15:35Z|00063|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:15:35 np0005601978 podman[170893]: 2026-01-30 09:15:35.081329401 +0000 UTC m=+0.095741749 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:15:37 np0005601978 python3.9[171070]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:38 np0005601978 python3.9[171222]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:38 np0005601978 python3.9[171374]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:39 np0005601978 python3.9[171526]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:40 np0005601978 python3.9[171678]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:40 np0005601978 python3.9[171830]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:41 np0005601978 python3.9[171982]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:42 np0005601978 python3.9[172134]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:42 np0005601978 python3.9[172286]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:43 np0005601978 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 30 04:15:43 np0005601978 python3.9[172439]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:44 np0005601978 python3.9[172591]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:44 np0005601978 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 30 04:15:44 np0005601978 python3.9[172744]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:45 np0005601978 python3.9[172896]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:45 np0005601978 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 30 04:15:45 np0005601978 python3.9[173049]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:46 np0005601978 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 30 04:15:46 np0005601978 python3.9[173201]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:47 np0005601978 python3.9[173354]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:15:48 np0005601978 python3.9[173506]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:49 np0005601978 python3.9[173658]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:15:50 np0005601978 python3.9[173810]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:15:50 np0005601978 systemd[1]: Reloading.
Jan 30 04:15:50 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:15:50 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:15:51 np0005601978 python3.9[173997]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:52 np0005601978 python3.9[174150]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:52 np0005601978 python3.9[174303]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:53 np0005601978 python3.9[174456]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:53 np0005601978 python3.9[174609]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:54 np0005601978 python3.9[174762]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:55 np0005601978 python3.9[174915]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:55 np0005601978 python3.9[175068]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:15:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:15:57.316 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:15:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:15:57.320 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:15:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:15:57.320 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:15:57 np0005601978 python3.9[175221]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:15:58 np0005601978 python3.9[175373]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:15:59 np0005601978 python3.9[175525]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:15:59 np0005601978 python3.9[175677]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:00 np0005601978 python3.9[175829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:00 np0005601978 python3.9[175981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:16:01Z|00064|chassis|WARN|Dropped 1 log messages in last 26 seconds (most recently, 26 seconds ago) due to excessive rate
Jan 30 04:16:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:16:01Z|00065|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:16:01 np0005601978 python3.9[176133]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:02 np0005601978 python3.9[176285]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:02 np0005601978 python3.9[176437]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:03 np0005601978 podman[176562]: 2026-01-30 09:16:03.095608628 +0000 UTC m=+0.055645466 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent)
Jan 30 04:16:03 np0005601978 python3.9[176607]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:05 np0005601978 podman[176632]: 2026-01-30 09:16:05.443489152 +0000 UTC m=+0.098421419 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:16:09 np0005601978 python3.9[176785]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 30 04:16:10 np0005601978 python3.9[176938]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:16:12 np0005601978 python3.9[177096]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 30 04:16:20 np0005601978 systemd-logind[793]: New session 26 of user zuul.
Jan 30 04:16:20 np0005601978 systemd[1]: Started Session 26 of User zuul.
Jan 30 04:16:20 np0005601978 systemd[1]: session-26.scope: Deactivated successfully.
Jan 30 04:16:20 np0005601978 systemd-logind[793]: Session 26 logged out. Waiting for processes to exit.
Jan 30 04:16:20 np0005601978 systemd-logind[793]: Removed session 26.
Jan 30 04:16:20 np0005601978 python3.9[177282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:21 np0005601978 python3.9[177403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764580.5164027-2636-19617789967481/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:22 np0005601978 python3.9[177553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:22 np0005601978 python3.9[177629]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:22 np0005601978 python3.9[177779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:23 np0005601978 python3.9[177900]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764582.5431714-2636-72823013720491/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:23 np0005601978 python3.9[178050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:24 np0005601978 python3.9[178171]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764583.442284-2636-63352182071590/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:24 np0005601978 python3.9[178321]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:25 np0005601978 python3.9[178442]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764584.3838694-2636-224006310564569/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:25 np0005601978 python3.9[178592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:26 np0005601978 python3.9[178713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764585.3188024-2636-44454661392369/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:27 np0005601978 python3.9[178865]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:16:28 np0005601978 python3.9[179017]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:16:29 np0005601978 python3.9[179169]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:29 np0005601978 python3.9[179321]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:30 np0005601978 python3.9[179444]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769764589.247821-2957-219241736023476/.source _original_basename=.nfvon6xh follow=False checksum=b9ec7244731af6d3462e359c9b04ff36ecdfc0c8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 30 04:16:31 np0005601978 python3.9[179596]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:31 np0005601978 python3.9[179748]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:32 np0005601978 python3.9[179869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764591.3712103-3034-149590431452985/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:32.626 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:16:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:32.627 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:16:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:32.630 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:16:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:32.630 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:16:33 np0005601978 python3.9[180019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:16:33 np0005601978 podman[180114]: 2026-01-30 09:16:33.398060337 +0000 UTC m=+0.081153841 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 30 04:16:33 np0005601978 python3.9[180151]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764592.6680164-3079-174948933021222/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:16:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:33.648 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:16:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:33.649 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:16:34 np0005601978 python3.9[180310]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 30 04:16:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:34.649 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:16:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:34.650 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:16:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:34.650 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:16:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:34.650 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:16:35 np0005601978 python3.9[180462]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:16:36 np0005601978 ovn_controller[95419]: 2026-01-30T09:16:36Z|00066|chassis|WARN|Dropped 4 log messages in last 31 seconds (most recently, 27 seconds ago) due to excessive rate
Jan 30 04:16:36 np0005601978 ovn_controller[95419]: 2026-01-30T09:16:36Z|00067|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:16:36 np0005601978 podman[180562]: 2026-01-30 09:16:36.438170744 +0000 UTC m=+0.093510212 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:16:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:36.655 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:16:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:36.655 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:16:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:36.659 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:16:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:36.661 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:16:36 np0005601978 python3[180641]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:16:36 np0005601978 podman[180677]: 2026-01-30 09:16:36.877746868 +0000 UTC m=+0.047706605 container create 260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:16:36 np0005601978 podman[180677]: 2026-01-30 09:16:36.850786021 +0000 UTC m=+0.020745788 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 30 04:16:36 np0005601978 python3[180641]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 30 04:16:37 np0005601978 python3.9[180867]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:38 np0005601978 python3.9[181021]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 30 04:16:39 np0005601978 python3.9[181173]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:16:40 np0005601978 python3[181325]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:16:40 np0005601978 podman[181362]: 2026-01-30 09:16:40.734017977 +0000 UTC m=+0.054655785 container create ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 30 04:16:40 np0005601978 podman[181362]: 2026-01-30 09:16:40.708287549 +0000 UTC m=+0.028925407 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 30 04:16:40 np0005601978 python3[181325]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 30 04:16:41 np0005601978 python3.9[181552]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:42 np0005601978 python3.9[181706]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:16:43 np0005601978 python3.9[181857]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764602.5323877-3367-272914428584591/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:16:43 np0005601978 python3.9[181933]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:16:43 np0005601978 systemd[1]: Reloading.
Jan 30 04:16:43 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:16:43 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:16:44 np0005601978 python3.9[182044]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:16:44 np0005601978 systemd[1]: Reloading.
Jan 30 04:16:44 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:16:44 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:16:44 np0005601978 systemd[1]: Starting nova_compute container...
Jan 30 04:16:45 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:16:45 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:45 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:45 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:45 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:45 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:45 np0005601978 podman[182084]: 2026-01-30 09:16:45.082997595 +0000 UTC m=+0.167332913 container init ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:16:45 np0005601978 podman[182084]: 2026-01-30 09:16:45.090064207 +0000 UTC m=+0.174399485 container start ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + sudo -E kolla_set_configs
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Validating config file
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying service configuration files
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Deleting /etc/ceph
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Creating directory /etc/ceph
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /etc/ceph
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Writing out command to execute
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:45 np0005601978 nova_compute[182099]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 30 04:16:45 np0005601978 nova_compute[182099]: ++ cat /run_command
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + CMD=nova-compute
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + ARGS=
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + sudo kolla_copy_cacerts
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + [[ ! -n '' ]]
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + . kolla_extend_start
Jan 30 04:16:45 np0005601978 nova_compute[182099]: Running command: 'nova-compute'
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + echo 'Running command: '\''nova-compute'\'''
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + umask 0022
Jan 30 04:16:45 np0005601978 nova_compute[182099]: + exec nova-compute
Jan 30 04:16:45 np0005601978 podman[182084]: nova_compute
Jan 30 04:16:45 np0005601978 systemd[1]: Started nova_compute container.
Jan 30 04:16:46 np0005601978 python3.9[182261]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:47 np0005601978 nova_compute[182099]: 2026-01-30 09:16:47.056 182103 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:47 np0005601978 nova_compute[182099]: 2026-01-30 09:16:47.056 182103 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:47 np0005601978 nova_compute[182099]: 2026-01-30 09:16:47.057 182103 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:47 np0005601978 nova_compute[182099]: 2026-01-30 09:16:47.057 182103 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 30 04:16:47 np0005601978 nova_compute[182099]: 2026-01-30 09:16:47.171 182103 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:16:47 np0005601978 python3.9[182413]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:47 np0005601978 nova_compute[182099]: 2026-01-30 09:16:47.193 182103 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:16:47 np0005601978 nova_compute[182099]: 2026-01-30 09:16:47.194 182103 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 30 04:16:48 np0005601978 python3.9[182565]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:16:49 np0005601978 python3.9[182717]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 30 04:16:49 np0005601978 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:16:50 np0005601978 python3.9[182892]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 30 04:16:50 np0005601978 systemd[1]: Stopping nova_compute container...
Jan 30 04:16:50 np0005601978 systemd[1]: libpod-ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c.scope: Deactivated successfully.
Jan 30 04:16:50 np0005601978 systemd[1]: libpod-ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c.scope: Consumed 2.104s CPU time.
Jan 30 04:16:50 np0005601978 podman[182896]: 2026-01-30 09:16:50.251820846 +0000 UTC m=+0.052302038 container died ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute)
Jan 30 04:16:50 np0005601978 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c-userdata-shm.mount: Deactivated successfully.
Jan 30 04:16:50 np0005601978 systemd[1]: var-lib-containers-storage-overlay-6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8-merged.mount: Deactivated successfully.
Jan 30 04:16:50 np0005601978 podman[182896]: 2026-01-30 09:16:50.305438063 +0000 UTC m=+0.105919245 container cleanup ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:16:50 np0005601978 podman[182896]: nova_compute
Jan 30 04:16:50 np0005601978 podman[182926]: nova_compute
Jan 30 04:16:50 np0005601978 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 30 04:16:50 np0005601978 systemd[1]: Stopped nova_compute container.
Jan 30 04:16:50 np0005601978 systemd[1]: Starting nova_compute container...
Jan 30 04:16:50 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:16:50 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:50 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:50 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:50 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:50 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f61e71058045d43e94e6e72a6fa6f7793a59d5d82c8f3ed8632dc1c2709cdc8/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:50 np0005601978 podman[182939]: 2026-01-30 09:16:50.502718166 +0000 UTC m=+0.095814998 container init ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm)
Jan 30 04:16:50 np0005601978 podman[182939]: 2026-01-30 09:16:50.507895403 +0000 UTC m=+0.100992225 container start ec5cfafb1b314d372d6dc277737089a5c440129a4243c25b19d016e4f4db4c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + sudo -E kolla_set_configs
Jan 30 04:16:50 np0005601978 podman[182939]: nova_compute
Jan 30 04:16:50 np0005601978 systemd[1]: Started nova_compute container.
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Validating config file
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying service configuration files
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /etc/ceph
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Creating directory /etc/ceph
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /etc/ceph
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Writing out command to execute
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:50 np0005601978 nova_compute[182955]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 30 04:16:50 np0005601978 nova_compute[182955]: ++ cat /run_command
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + CMD=nova-compute
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + ARGS=
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + sudo kolla_copy_cacerts
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + [[ ! -n '' ]]
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + . kolla_extend_start
Jan 30 04:16:50 np0005601978 nova_compute[182955]: Running command: 'nova-compute'
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + echo 'Running command: '\''nova-compute'\'''
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + umask 0022
Jan 30 04:16:50 np0005601978 nova_compute[182955]: + exec nova-compute
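The startup trace above follows the usual kolla_start sequence: kolla_set_configs loads /var/lib/kolla/config_files/config.json, removes each stale target, copies the source file into place and resets ownership and permissions; the wrapper then reads /run_command, runs kolla_copy_cacerts, and finally execs nova-compute. A minimal Python sketch of the copy loop, assuming the conventional config.json "config_files" fields (source, dest, owner, perm) rather than the actual set_configs.py implementation:

    import json
    import os
    import shutil
    import subprocess

    def copy_config_files(path="/var/lib/kolla/config_files/config.json"):
        # Sketch only: mirrors the Deleting/Copying/Setting permission
        # messages logged above; the real logic lives in kolla's set_configs.py.
        with open(path) as f:
            config = json.load(f)
        for entry in config.get("config_files", []):
            source, dest = entry["source"], entry["dest"]
            if os.path.lexists(dest):
                print(f"Deleting {dest}")
                shutil.rmtree(dest) if os.path.isdir(dest) else os.remove(dest)
            print(f"Copying {source} to {dest}")
            shutil.copy2(source, dest)
            print(f"Setting permission for {dest}")
            if "owner" in entry:
                subprocess.run(["chown", entry["owner"], dest], check=True)
            if "perm" in entry:
                os.chmod(dest, int(entry["perm"], 8))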
Jan 30 04:16:52 np0005601978 python3.9[183119]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 30 04:16:52 np0005601978 systemd[1]: Started libpod-conmon-260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9.scope.
Jan 30 04:16:52 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:16:52 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1adcf667e328910abad4e649838e353ee532cfdafdf73ab4a6ac093d7f89b396/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:52 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1adcf667e328910abad4e649838e353ee532cfdafdf73ab4a6ac093d7f89b396/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:52 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1adcf667e328910abad4e649838e353ee532cfdafdf73ab4a6ac093d7f89b396/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 30 04:16:52 np0005601978 podman[183144]: 2026-01-30 09:16:52.261247228 +0000 UTC m=+0.123653277 container init 260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:16:52 np0005601978 podman[183144]: 2026-01-30 09:16:52.269968521 +0000 UTC m=+0.132374500 container start 260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.license=GPLv2, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 30 04:16:52 np0005601978 python3.9[183119]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 30 04:16:52 np0005601978 nova_compute[182955]: 2026-01-30 09:16:52.291 182959 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:52 np0005601978 nova_compute[182955]: 2026-01-30 09:16:52.291 182959 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:52 np0005601978 nova_compute[182955]: 2026-01-30 09:16:52.291 182959 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 30 04:16:52 np0005601978 nova_compute[182955]: 2026-01-30 09:16:52.291 182959 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Applying nova statedir ownership
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 30 04:16:52 np0005601978 nova_compute_init[183167]: INFO:nova_statedir:Nova statedir ownership complete
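The nova_compute_init messages above come from nova_statedir_ownership.py: walk /var/lib/nova, chown anything not already owned by 42436:42436, relabel it with the container_file_t SELinux context, and skip the paths listed in NOVA_STATEDIR_OWNERSHIP_SKIP. A rough, illustrative equivalent (not the shipped script; the walk and skip handling are assumptions, while the uid/gid and context are taken from the log):

    import os
    import subprocess

    TARGET_UID = TARGET_GID = 42436
    SECONTEXT = "system_u:object_r:container_file_t:s0"
    SKIP = set(filter(None, os.environ.get("NOVA_STATEDIR_OWNERSHIP_SKIP", "").split(":")))

    def apply_statedir_ownership(root="/var/lib/nova"):
        for dirpath, _dirnames, filenames in os.walk(root):
            for path in [dirpath] + [os.path.join(dirpath, f) for f in filenames]:
                if path in SKIP:
                    continue
                st = os.stat(path)
                print(f"Checking uid: {st.st_uid} gid: {st.st_gid} path: {path}")
                if (st.st_uid, st.st_gid) != (TARGET_UID, TARGET_GID):
                    os.chown(path, TARGET_UID, TARGET_GID)
                # the real script uses the selinux bindings; chcon keeps this sketch short
                subprocess.run(["chcon", SECONTEXT, path], check=False)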
Jan 30 04:16:52 np0005601978 systemd[1]: libpod-260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9.scope: Deactivated successfully.
Jan 30 04:16:52 np0005601978 podman[183180]: 2026-01-30 09:16:52.35152436 +0000 UTC m=+0.025934053 container died 260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, container_name=nova_compute_init, org.label-schema.vendor=CentOS)
Jan 30 04:16:52 np0005601978 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9-userdata-shm.mount: Deactivated successfully.
Jan 30 04:16:52 np0005601978 systemd[1]: var-lib-containers-storage-overlay-1adcf667e328910abad4e649838e353ee532cfdafdf73ab4a6ac093d7f89b396-merged.mount: Deactivated successfully.
Jan 30 04:16:52 np0005601978 podman[183180]: 2026-01-30 09:16:52.402833552 +0000 UTC m=+0.077243215 container cleanup 260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 30 04:16:52 np0005601978 systemd[1]: libpod-conmon-260572bd48844e2922e95be43c7c2bfb1819e82ed2539d450372638f0714f8d9.scope: Deactivated successfully.
Jan 30 04:16:52 np0005601978 nova_compute[182955]: 2026-01-30 09:16:52.414 182959 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:16:52 np0005601978 nova_compute[182955]: 2026-01-30 09:16:52.424 182959 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:16:52 np0005601978 nova_compute[182955]: 2026-01-30 09:16:52.425 182959 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
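The grep against /sbin/iscsiadm above is a capability probe: inside this container iscsiadm was replaced earlier by the run-on-host wrapper, so the string node.session.scan is not found and the command exits 1, which the caller treats as "manual scan not supported". A hedged sketch of such a probe with oslo_concurrency (illustrative only, not os-brick's actual code):

    from oslo_concurrency import processutils

    def supports_manual_scan(iscsiadm="/sbin/iscsiadm"):
        # Exit status 0 means the binary mentions node.session.scan;
        # any failure is treated as "feature absent" in this sketch.
        try:
            processutils.execute("grep", "-F", "node.session.scan", iscsiadm)
            return True
        except processutils.ProcessExecutionError:
            return False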
Jan 30 04:16:52 np0005601978 systemd-logind[793]: Session 25 logged out. Waiting for processes to exit.
Jan 30 04:16:52 np0005601978 systemd[1]: session-25.scope: Deactivated successfully.
Jan 30 04:16:52 np0005601978 systemd[1]: session-25.scope: Consumed 1min 28.877s CPU time.
Jan 30 04:16:52 np0005601978 systemd-logind[793]: Removed session 25.
Jan 30 04:16:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:57.320 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:16:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:57.320 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:16:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:16:57.320 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:16:57 np0005601978 nova_compute[182955]: 2026-01-30 09:16:57.446 182959 ERROR oslo.messaging._drivers.impl_rabbit [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Connection failed: timed out (retrying in 1.0 seconds): socket.timeout: timed out#033[00m
Jan 30 04:16:58 np0005601978 systemd-logind[793]: New session 27 of user zuul.
Jan 30 04:16:58 np0005601978 systemd[1]: Started Session 27 of User zuul.
Jan 30 04:16:59 np0005601978 python3.9[183385]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 30 04:17:01 np0005601978 python3.9[183541]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:17:01 np0005601978 systemd[1]: Reloading.
Jan 30 04:17:01 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:17:01 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:17:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:17:01Z|00068|chassis|WARN|Dropped 1 log messages in last 25 seconds (most recently, 25 seconds ago) due to excessive rate
Jan 30 04:17:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:17:01Z|00069|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:17:02 np0005601978 python3.9[183725]: ansible-ansible.builtin.service_facts Invoked
Jan 30 04:17:02 np0005601978 network[183742]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 30 04:17:02 np0005601978 network[183743]: 'network-scripts' will be removed from distribution in near future.
Jan 30 04:17:02 np0005601978 network[183744]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 30 04:17:03 np0005601978 nova_compute[182955]: 2026-01-30 09:17:03.451 182959 ERROR oslo.messaging._drivers.impl_rabbit [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Connection failed: timed out (retrying in 3.0 seconds): socket.timeout: timed out#033[00m
Jan 30 04:17:03 np0005601978 podman[183777]: 2026-01-30 09:17:03.552217125 +0000 UTC m=+0.075311328 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:17:06 np0005601978 podman[184007]: 2026-01-30 09:17:06.96697078 +0000 UTC m=+0.086042349 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:17:07 np0005601978 python3.9[184054]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:17:08 np0005601978 python3.9[184214]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:08 np0005601978 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 30 04:17:08 np0005601978 python3.9[184367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:09 np0005601978 python3.9[184519]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:17:10 np0005601978 python3.9[184671]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:17:11 np0005601978 nova_compute[182955]: 2026-01-30 09:17:11.480 182959 ERROR oslo.messaging._drivers.impl_rabbit [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Connection failed: timed out (retrying in 5.0 seconds): socket.timeout: timed out#033[00m
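The impl_rabbit lines above show the AMQP connection attempts backing off with growing intervals (1.0 s, 3.0 s, 5.0 s) while RabbitMQ is unreachable. A small illustration of that connect-and-back-off pattern; the interval sequence is read off the log and is not a claim about oslo.messaging's internal algorithm:

    import socket
    import time

    def connect_with_backoff(connect, intervals=(1.0, 3.0, 5.0, 7.0)):
        for wait in intervals:
            try:
                return connect()
            except socket.timeout:
                print(f"Connection failed: timed out (retrying in {wait} seconds)")
                time.sleep(wait)
        return connect()  # final attempt; let the exception propagate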
Jan 30 04:17:11 np0005601978 python3.9[184823]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:17:11 np0005601978 systemd[1]: Reloading.
Jan 30 04:17:11 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:17:11 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:17:12 np0005601978 python3.9[185010]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:17:13 np0005601978 python3.9[185163]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:14 np0005601978 python3.9[185313]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:17:15 np0005601978 python3.9[185467]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 30 04:17:16 np0005601978 python3.9[185619]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 30 04:17:16 np0005601978 nova_compute[182955]: 2026-01-30 09:17:16.982 182959 INFO nova.virt.driver [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.026 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:17:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:17.028 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
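The traceback above is raised while DbAddCommand adds neutron:ovn-metadata-id to the chassis external_ids: ovsdbapp first looks the Chassis_Private row up in its local IDL copy, and that row is not (or not yet) present, so idlutils raises RowNotFound. A hedged sketch of how a caller could tolerate this by retrying until the row appears (illustrative; the metadata agent's own handling may differ):

    import time

    from ovsdbapp.backend.ovs_idl import idlutils

    def add_chassis_external_id(api, chassis, key, value, attempts=5, wait=1.0):
        for _ in range(attempts):
            try:
                return api.db_add("Chassis_Private", chassis,
                                  "external_ids", {key: value}).execute(check_error=True)
            except idlutils.RowNotFound:
                time.sleep(wait)  # give the IDL a chance to receive the row
        raise idlutils.RowNotFound(table="Chassis_Private", col="name", match=chassis)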
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.075 182959 INFO nova.compute.provider_config [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.084 182959 DEBUG oslo_concurrency.lockutils [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.085 182959 DEBUG oslo_concurrency.lockutils [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.085 182959 DEBUG oslo_concurrency.lockutils [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.085 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.085 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.085 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.085 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.086 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.086 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.086 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.086 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.086 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.086 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.086 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.086 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.087 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.087 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.087 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.087 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.087 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.087 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.087 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.088 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.088 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.088 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.088 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.088 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.088 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.088 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.088 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.089 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.089 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.089 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.089 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.089 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.089 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.090 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.090 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.090 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.090 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.090 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.090 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.090 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.091 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.091 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.091 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.091 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.091 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.091 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.091 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.092 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.092 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.092 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.092 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.092 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.092 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.092 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.093 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.093 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.093 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.093 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.093 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.093 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.093 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.094 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.094 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.094 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.094 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.094 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.094 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.094 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.094 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.095 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.095 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.095 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.095 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.095 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.095 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.095 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.096 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.096 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.096 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.096 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.096 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.096 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.096 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.097 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.097 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.097 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.097 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.097 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.097 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.097 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.097 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.098 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.098 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.098 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.098 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.098 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.098 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.099 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.099 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.099 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.099 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.099 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.099 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.099 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.100 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.100 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.100 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.100 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.100 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.100 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.100 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.100 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.101 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.101 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.101 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.101 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.101 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.101 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.101 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.102 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.102 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.102 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.102 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.102 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.102 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.102 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.102 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.103 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.103 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.103 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.103 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.103 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.103 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.103 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.104 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.104 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.104 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.104 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.104 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.104 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.104 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.105 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.105 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.105 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.105 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.105 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.105 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.105 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.105 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.106 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.106 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.106 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.106 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.106 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.106 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.106 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.107 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.107 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.107 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.107 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.107 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.107 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.108 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.108 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.108 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.108 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.108 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.108 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.108 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.109 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.109 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.109 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.109 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.109 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.109 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.109 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.110 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.110 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.110 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.110 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.110 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.110 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.110 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.110 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.111 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.111 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.111 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.111 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.111 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.111 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.111 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.112 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.112 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.112 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.112 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.112 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.112 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.112 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.113 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.113 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.113 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.113 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.113 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.113 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.113 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.113 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.114 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.114 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.114 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.114 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.114 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.114 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.114 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.115 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.115 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.115 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.115 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.115 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.115 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.115 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.116 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.116 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.116 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.116 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.116 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.116 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.116 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.117 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.117 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.117 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.117 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.117 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.117 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.117 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.117 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.118 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.118 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.118 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.118 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.118 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.118 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.118 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.119 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.119 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.119 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.119 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.119 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.119 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.119 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.119 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.120 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.120 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.120 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.120 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.120 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.120 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.120 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.121 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.121 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.121 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.121 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.121 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.121 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.121 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.122 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.122 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.122 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.122 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.122 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.122 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.122 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.122 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.123 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.123 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.123 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.123 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.123 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.123 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.123 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.124 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.124 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.124 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.124 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.124 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.124 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.124 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.125 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.125 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.125 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.125 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.125 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.125 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.125 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.126 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.126 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.126 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.126 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.126 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.126 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.127 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.127 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.127 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.127 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.127 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.127 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.127 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.128 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.128 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.128 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.128 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.128 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.128 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.128 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.129 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.129 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.129 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.129 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.129 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.130 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.130 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.130 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.130 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.130 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.130 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.131 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.131 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.131 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.131 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.131 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.132 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.132 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.132 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.132 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.132 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.132 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.133 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.133 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.133 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.133 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.133 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.133 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.133 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.134 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.134 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.134 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.134 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.134 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.134 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.134 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.135 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.135 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.135 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.135 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.135 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.135 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.136 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.136 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.136 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.136 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.136 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.137 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.137 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.137 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.137 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.137 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.137 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.137 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.138 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.138 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.138 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.138 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.138 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.138 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.138 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.139 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.139 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.139 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.139 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.139 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.139 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.139 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.139 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.140 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.140 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.140 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.140 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.140 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.140 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.140 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.141 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.141 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.141 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.141 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.141 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.141 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.141 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.142 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.142 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.142 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.142 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.142 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.142 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.142 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.142 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.143 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.143 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.143 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.143 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.143 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.144 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.144 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.144 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.144 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.144 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.144 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.144 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.145 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.145 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.145 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.145 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.145 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.145 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.145 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.146 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.146 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.146 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.146 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.146 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.147 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.147 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.147 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.147 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.147 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.147 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.147 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.148 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.148 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.148 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.148 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.148 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.148 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.148 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.148 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.149 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.149 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.149 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.149 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.149 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.150 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.150 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.150 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.150 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.150 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.151 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.151 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.151 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.151 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.151 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.152 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.152 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.152 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.152 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.152 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.153 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.153 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.153 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.153 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.153 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.153 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.154 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.154 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.154 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.154 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.154 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.155 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.155 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.155 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.155 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.155 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.155 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.156 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.156 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.156 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.156 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.156 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.157 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.157 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.157 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.157 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.157 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.157 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.158 182959 WARNING oslo_config.cfg [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 30 04:17:17 np0005601978 nova_compute[182955]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 30 04:17:17 np0005601978 nova_compute[182955]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 30 04:17:17 np0005601978 nova_compute[182955]: and ``live_migration_inbound_addr`` respectively.
Jan 30 04:17:17 np0005601978 nova_compute[182955]: ).  Its value may be silently ignored in the future.#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.158 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
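For reference, the deprecation warning above says the templated libvirt.live_migration_uri value recorded here (qemu+tls://%s/system) is superseded by the pair libvirt.live_migration_scheme and libvirt.live_migration_inbound_addr. The minimal Python sketch below only illustrates that relationship between the old template and the two replacement options; build_migration_uri() and its defaults are assumptions for the example, not Nova's actual implementation.

    # Sketch: compose a destination URI equivalent to the deprecated
    # "qemu+tls://%s/system" template from the two replacement options.
    def build_migration_uri(dest_host, scheme="tls", inbound_addr=None):
        # scheme       ~ libvirt.live_migration_scheme ("tcp" if unset)
        # inbound_addr ~ libvirt.live_migration_inbound_addr (falls back to the
        #                destination host name when not configured)
        target = inbound_addr or dest_host
        return "qemu+%s://%s/system" % (scheme or "tcp", target)

    # e.g. build_migration_uri("dest-compute") -> "qemu+tls://dest-compute/system"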
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.158 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.158 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.158 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.158 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.159 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.159 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.159 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.159 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.159 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.159 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.160 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.160 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.160 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.160 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.160 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.160 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.161 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.161 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.161 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.161 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.161 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.161 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.161 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.161 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.162 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.162 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.162 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.162 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.162 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.162 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.163 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.163 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.163 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.163 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.163 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.164 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.164 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.164 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.164 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.164 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.164 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.164 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.165 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.165 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.165 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.165 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.165 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.165 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.165 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.166 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.166 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.166 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.166 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.166 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.166 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.166 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.167 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.167 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.167 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.167 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.167 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.167 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.167 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.168 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.168 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.168 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.168 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.168 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.168 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.168 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.169 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.169 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.169 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.169 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.169 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.169 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.169 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.170 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.170 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.170 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.170 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.170 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.170 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.170 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.171 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.171 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.171 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.171 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.171 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.172 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.172 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.172 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.172 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.172 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.172 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.173 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.173 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.173 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.173 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.173 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.173 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.174 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.174 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.174 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.174 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.174 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.174 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.174 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.175 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.175 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.175 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.175 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.175 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.175 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.176 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.176 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.176 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.176 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.176 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.176 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.176 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.177 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.177 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.177 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.177 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.177 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.177 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.178 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.178 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.178 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.178 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.178 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.178 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.178 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.179 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.179 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.179 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.179 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.180 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.180 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.180 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.180 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.180 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.180 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.181 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.181 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.181 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.181 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.181 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.181 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.182 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.182 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.182 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.182 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.182 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.182 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.183 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.183 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.183 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.183 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.183 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.183 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.184 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.184 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.184 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.184 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.184 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.184 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.184 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.185 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.185 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.185 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.185 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.185 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.185 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.185 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.186 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.186 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.186 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.186 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.186 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.186 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.187 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.187 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.187 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.187 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.187 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.187 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.187 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.188 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.188 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.188 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.188 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.188 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.189 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.189 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.189 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.189 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.189 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.190 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.190 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.190 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.190 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.190 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.191 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.191 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.191 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.191 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.191 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.191 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.192 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.192 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.192 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.192 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.192 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.192 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.193 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.193 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.193 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.193 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.193 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.193 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.194 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.194 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.194 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.194 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.194 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.194 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.194 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.194 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.195 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.195 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.195 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.195 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.195 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.195 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.195 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.196 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.196 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.196 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.196 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.196 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.196 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.197 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.197 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.197 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.197 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.197 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.198 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.198 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.198 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.198 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.198 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.198 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.198 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.199 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.199 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.199 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.199 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.199 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.199 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.199 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.200 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.200 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.200 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.200 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.200 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.200 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.200 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.201 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.201 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.201 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.201 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.201 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.201 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.201 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.201 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.202 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.202 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.202 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.202 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.202 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.202 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.202 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.203 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.203 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.203 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.203 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.203 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.203 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.203 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.204 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.204 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.204 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.204 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.204 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.204 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.204 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.205 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.205 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.205 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.205 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.205 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.205 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.205 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.205 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.206 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.206 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.206 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.206 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.206 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.206 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.206 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.207 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.207 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.207 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.207 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.207 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.207 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.207 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.208 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.208 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.208 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.208 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.208 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.208 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.208 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.209 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.209 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.209 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.209 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.209 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.209 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.209 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.210 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.210 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.210 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.210 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.210 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.210 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.210 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.211 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.211 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.211 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.211 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.211 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.211 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.211 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.212 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.212 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.212 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.212 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.212 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.212 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.212 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.212 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.213 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.213 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.213 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.213 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.213 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.213 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.213 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.214 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.214 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.214 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.214 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.214 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.214 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.214 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.214 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.215 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.215 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.215 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.215 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.215 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.215 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.215 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.216 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.216 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.216 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.216 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.216 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.216 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.216 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.217 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.217 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.217 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.217 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.217 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.217 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.217 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.217 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.218 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.218 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.218 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.218 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.218 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.218 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.218 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.219 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.219 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.219 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.219 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.219 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.219 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.220 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.220 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.220 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.220 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.220 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.220 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.220 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.221 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.221 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.221 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.221 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.221 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.221 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.221 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.222 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.222 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.222 182959 DEBUG oslo_service.service [None req-9157fa9a-ddc7-4035-9045-a7c0ae8432c7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
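[editor's note] The dump that ends above is the standard oslo.config option listing that OpenStack services emit at DEBUG level on startup via ConfigOpts.log_opt_values() (the cfg.py:2609 call site shown on each line). A minimal, self-contained sketch of how such a dump is produced, using two hypothetical options rather than Nova's real option set:

    # Sketch only, not Nova's startup code: register two hypothetical options
    # with oslo.config and dump their effective values at DEBUG level, which
    # yields "group.option = value log_opt_values ..." lines like those above.
    import logging

    from oslo_config import cfg

    CONF = cfg.CONF
    CONF.register_opts(
        [
            cfg.BoolOpt('rabbit_quorum_queue', default=True),  # hypothetical defaults
            cfg.IntOpt('rabbit_retry_backoff', default=2),
        ],
        group='oslo_messaging_rabbit',
    )

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF([], project='demo')                   # parse an empty command line
    CONF.log_opt_values(LOG, logging.DEBUG)    # emits the per-option dump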
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.222 182959 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.238 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.238 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.239 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.239 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 30 04:17:17 np0005601978 systemd[1]: Starting libvirt QEMU daemon...
Jan 30 04:17:17 np0005601978 systemd[1]: Started libvirt QEMU daemon.
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.300 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f610ac387f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.303 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f610ac387f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.304 182959 INFO nova.virt.libvirt.driver [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.320 182959 WARNING nova.virt.libvirt.driver [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 30 04:17:17 np0005601978 nova_compute[182955]: 2026-01-30 09:17:17.320 182959 DEBUG nova.virt.libvirt.volume.mount [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 30 04:17:17 np0005601978 python3.9[185824]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.201 182959 INFO nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Libvirt host capabilities <capabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <host>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <uuid>5c6fc8cf-a2fa-4ff8-aeb9-2a548fe8efbe</uuid>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <arch>x86_64</arch>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model>EPYC-Rome-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <vendor>AMD</vendor>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <microcode version='16777317'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <signature family='23' model='49' stepping='0'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='x2apic'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='tsc-deadline'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='osxsave'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='hypervisor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='tsc_adjust'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='spec-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='stibp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='arch-capabilities'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='cmp_legacy'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='topoext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='virt-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='lbrv'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='tsc-scale'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='vmcb-clean'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='pause-filter'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='pfthreshold'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='svme-addr-chk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='rdctl-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='skip-l1dfl-vmentry'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='mds-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature name='pschange-mc-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <pages unit='KiB' size='4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <pages unit='KiB' size='2048'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <pages unit='KiB' size='1048576'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <power_management>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <suspend_mem/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <suspend_disk/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <suspend_hybrid/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </power_management>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <iommu support='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <migration_features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <live/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <uri_transports>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <uri_transport>tcp</uri_transport>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <uri_transport>rdma</uri_transport>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </uri_transports>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </migration_features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <topology>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <cells num='1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <cell id='0'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:          <memory unit='KiB'>7864292</memory>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:          <pages unit='KiB' size='4'>1966073</pages>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:          <pages unit='KiB' size='2048'>0</pages>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:          <distances>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <sibling id='0' value='10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:          </distances>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:          <cpus num='8'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:          </cpus>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        </cell>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </cells>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </topology>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <cache>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </cache>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <secmodel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model>selinux</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <doi>0</doi>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </secmodel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <secmodel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model>dac</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <doi>0</doi>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </secmodel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </host>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <guest>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <os_type>hvm</os_type>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <arch name='i686'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <wordsize>32</wordsize>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <domain type='qemu'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <domain type='kvm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </arch>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <pae/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <nonpae/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <acpi default='on' toggle='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <apic default='on' toggle='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <cpuselection/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <deviceboot/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <disksnapshot default='on' toggle='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <externalSnapshot/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </guest>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <guest>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <os_type>hvm</os_type>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <arch name='x86_64'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <wordsize>64</wordsize>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <domain type='qemu'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <domain type='kvm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </arch>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <acpi default='on' toggle='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <apic default='on' toggle='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <cpuselection/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <deviceboot/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <disksnapshot default='on' toggle='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <externalSnapshot/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </guest>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 
Jan 30 04:17:18 np0005601978 nova_compute[182955]: </capabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: #033[00m
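[editor's note] The <capabilities> document logged above is what libvirt reports for this host over the qemu:///system connection the service just opened. A minimal sketch (not Nova's code) that fetches the same XML with the libvirt Python bindings and reads the host CPU architecture and model from it:

    # Sketch only: query the local libvirt daemon for its host capabilities
    # XML and print the host CPU architecture and model reported in it.
    import xml.etree.ElementTree as ET

    import libvirt  # libvirt-python bindings

    conn = libvirt.open('qemu:///system')
    caps_xml = conn.getCapabilities()   # the <capabilities> document logged above
    root = ET.fromstring(caps_xml)

    print(root.findtext('host/cpu/arch'))    # e.g. x86_64 on this host
    print(root.findtext('host/cpu/model'))   # e.g. EPYC-Rome-v4 on this host
    conn.close()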
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.211 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.228 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 30 04:17:18 np0005601978 nova_compute[182955]: <domainCapabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <domain>kvm</domain>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <arch>i686</arch>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <vcpu max='4096'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <iothreads supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <os supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <enum name='firmware'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <loader supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>rom</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pflash</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='readonly'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>yes</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>no</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='secure'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>no</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </loader>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </os>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>on</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>off</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='maximum' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='maximumMigratable'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>on</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>off</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='host-model' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <vendor>AMD</vendor>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='x2apic'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='stibp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='succor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='lbrv'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='custom' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='ClearwaterForest'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ddpd-u'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sha512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ddpd-u'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sha512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Dhyana-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Turin'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibpb-brtype'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbpb'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibpb-brtype'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbpb'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-128'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-256'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-128'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-256'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='KnightsMill'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4fmaps'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4vnniw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512er'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512pf'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='KnightsMill-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4fmaps'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4vnniw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512er'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512pf'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tbm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tbm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='athlon'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='athlon-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='core2duo'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='core2duo-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='coreduo'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='coreduo-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='n270'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='n270-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='phenom'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='phenom-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <memoryBacking supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <enum name='sourceType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>file</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>anonymous</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>memfd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </memoryBacking>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <devices>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <disk supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='diskDevice'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>disk</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>cdrom</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>floppy</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>lun</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='bus'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>fdc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>scsi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>sata</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-non-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </disk>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <graphics supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vnc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>egl-headless</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dbus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </graphics>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <video supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='modelType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vga</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>cirrus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>none</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>bochs</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ramfb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </video>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <hostdev supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='mode'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>subsystem</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='startupPolicy'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>default</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>mandatory</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>requisite</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>optional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='subsysType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pci</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>scsi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='capsType'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='pciBackend'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </hostdev>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <rng supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-non-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>random</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>egd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>builtin</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </rng>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <filesystem supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='driverType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>path</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>handle</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtiofs</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </filesystem>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <tpm supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tpm-tis</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tpm-crb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>emulator</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>external</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendVersion'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>2.0</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </tpm>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <redirdev supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='bus'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </redirdev>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <channel supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pty</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>unix</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </channel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <crypto supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>qemu</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>builtin</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </crypto>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <interface supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>default</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>passt</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </interface>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <panic supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>isa</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>hyperv</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </panic>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <console supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>null</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pty</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dev</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>file</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pipe</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>stdio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>udp</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tcp</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>unix</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>qemu-vdagent</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dbus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </console>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </devices>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <gic supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <vmcoreinfo supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <genid supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <backingStoreInput supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <backup supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <async-teardown supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <s390-pv supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <ps2 supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <tdx supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <sev supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <sgx supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <hyperv supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='features'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>relaxed</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vapic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>spinlocks</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vpindex</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>runtime</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>synic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>stimer</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>reset</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vendor_id</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>frequencies</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>reenlightenment</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tlbflush</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ipi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>avic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>emsr_bitmap</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>xmm_input</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <defaults>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <spinlocks>4095</spinlocks>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <stimer_direct>on</stimer_direct>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </defaults>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </hyperv>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <launchSecurity supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: </domainCapabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
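[Editor's note, not part of the captured log: the record above is nova's _get_domain_capabilities helper logging the domainCapabilities XML it fetched from libvirt. A minimal sketch of retrieving the same XML directly with libvirt-python is shown below for reference; the connection URI and emulator/arch/machine values are assumptions taken from the dumps in this log, not from nova's code.]

    # Illustrative sketch: fetch the domainCapabilities XML that nova logs above,
    # using the libvirt-python bindings directly.
    import libvirt

    # Assumes a local system libvirt daemon, as on this compute node.
    conn = libvirt.open('qemu:///system')

    # Positional arguments: emulatorbin, arch, machine, virttype, flags.
    # The emulator path and virt type match the <path> and <domain> elements
    # seen in the capability dumps in this log.
    caps_xml = conn.getDomainCapabilities(
        '/usr/libexec/qemu-kvm', 'x86_64', 'pc', 'kvm', 0)

    print(caps_xml)  # same <domainCapabilities> document as in the debug output
    conn.close()

[The equivalent CLI check is `virsh domcapabilities --arch x86_64 --machine pc --virttype kvm`.]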
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.237 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 30 04:17:18 np0005601978 nova_compute[182955]: <domainCapabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <domain>kvm</domain>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <arch>i686</arch>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <vcpu max='240'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <iothreads supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <os supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <enum name='firmware'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <loader supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>rom</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pflash</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='readonly'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>yes</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>no</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='secure'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>no</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </loader>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </os>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>on</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>off</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='maximum' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='maximumMigratable'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>on</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>off</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='host-model' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <vendor>AMD</vendor>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='x2apic'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='stibp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='succor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='lbrv'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='custom' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='ClearwaterForest'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ddpd-u'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sha512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ddpd-u'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sha512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Dhyana-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Turin'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibpb-brtype'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbpb'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibpb-brtype'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbpb'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-128'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-256'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-128'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-256'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='KnightsMill'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4fmaps'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4vnniw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512er'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512pf'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='KnightsMill-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4fmaps'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4vnniw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512er'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512pf'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
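[editor's note] The XML fragment nova_compute is logging above is libvirt's per-model CPU usability report: each <model usable='no'> entry is followed by a <blockers> element listing the features the host CPU lacks for that model, while usable='yes' entries (such as the Nehalem family immediately below) carry no blockers. As an illustrative sketch only (not part of the log and not Nova's or libvirt's own code; the file name, the assumption that the fragment has been saved and wrapped in a single root element, and the output format are all hypothetical), such a fragment could be inspected with Python's standard xml.etree module:

    # Hypothetical helper: summarise usable CPU models and the blockers
    # reported for the unusable ones in a domcapabilities-style fragment.
    # Assumes the logged <model>/<blockers> elements were saved to
    # cpu_models.xml under one enclosing root element.
    import xml.etree.ElementTree as ET

    root = ET.parse("cpu_models.xml").getroot()

    # Map each blocked model name to the host-missing features named
    # inside its <blockers> element.
    blockers_by_model = {
        b.get("model"): [f.get("name") for f in b.findall("feature")]
        for b in root.iter("blockers")
    }

    for model in root.iter("model"):
        name = model.text
        if model.get("usable") == "yes":
            print(f"{name}: usable")
        else:
            missing = ", ".join(blockers_by_model.get(name, []))
            print(f"{name}: blocked by {missing}")

[editor's note] Run against the fragment above, a sketch like this would report, for example, that Haswell is blocked by erms, hle, invpcid, pcid and rtm on this guest, while Nehalem and SandyBridge remain usable.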
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tbm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tbm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='athlon'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='athlon-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='core2duo'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='core2duo-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='coreduo'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='coreduo-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='n270'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='n270-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='phenom'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='phenom-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <memoryBacking supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <enum name='sourceType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>file</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>anonymous</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>memfd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </memoryBacking>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <devices>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <disk supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='diskDevice'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>disk</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>cdrom</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>floppy</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>lun</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='bus'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ide</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>fdc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>scsi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>sata</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-non-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </disk>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <graphics supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vnc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>egl-headless</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dbus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </graphics>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <video supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='modelType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vga</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>cirrus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>none</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>bochs</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ramfb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </video>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <hostdev supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='mode'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>subsystem</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='startupPolicy'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>default</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>mandatory</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>requisite</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>optional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='subsysType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pci</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>scsi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='capsType'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='pciBackend'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </hostdev>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <rng supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-non-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>random</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>egd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>builtin</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </rng>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <filesystem supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='driverType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>path</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>handle</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtiofs</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </filesystem>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <tpm supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tpm-tis</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tpm-crb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>emulator</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>external</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendVersion'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>2.0</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </tpm>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <redirdev supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='bus'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </redirdev>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <channel supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pty</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>unix</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </channel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <crypto supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>qemu</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>builtin</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </crypto>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <interface supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>default</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>passt</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </interface>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <panic supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>isa</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>hyperv</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </panic>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <console supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>null</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pty</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dev</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>file</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pipe</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>stdio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>udp</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tcp</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>unix</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>qemu-vdagent</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dbus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </console>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </devices>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <gic supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <vmcoreinfo supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <genid supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <backingStoreInput supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <backup supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <async-teardown supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <s390-pv supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <ps2 supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <tdx supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <sev supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <sgx supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <hyperv supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='features'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>relaxed</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vapic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>spinlocks</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vpindex</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>runtime</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>synic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>stimer</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>reset</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vendor_id</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>frequencies</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>reenlightenment</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tlbflush</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ipi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>avic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>emsr_bitmap</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>xmm_input</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <defaults>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <spinlocks>4095</spinlocks>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <stimer_direct>on</stimer_direct>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </defaults>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </hyperv>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <launchSecurity supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: </domainCapabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
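The domainCapabilities XML that nova logs here can be re-fetched directly from libvirt when spot-checking a compute node, either with `virsh domcapabilities` or via the Python bindings that nova itself uses. A minimal sketch follows; it assumes the local qemu:///system URI is reachable and reuses the emulator path, arch, machine type, and virt type reported in these dumps.

    # Sketch: fetch the same domainCapabilities XML nova logs above.
    # Assumes qemu:///system is the local hypervisor URI (not stated in the log).
    import libvirt

    conn = libvirt.open('qemu:///system')
    caps_xml = conn.getDomainCapabilities(
        '/usr/libexec/qemu-kvm',  # emulator binary reported in the dump
        'x86_64',                 # architecture nova queries
        'q35',                    # machine type (nova also queries 'pc')
        'kvm',                    # virt type
        0)
    print(caps_xml)
    conn.close()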
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.316 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.320 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 30 04:17:18 np0005601978 nova_compute[182955]: <domainCapabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <domain>kvm</domain>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <arch>x86_64</arch>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <vcpu max='4096'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <iothreads supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <os supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <enum name='firmware'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>efi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <loader supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>rom</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pflash</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='readonly'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>yes</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>no</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='secure'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>yes</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>no</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </loader>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </os>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>on</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>off</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='maximum' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='maximumMigratable'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>on</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>off</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='host-model' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <vendor>AMD</vendor>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='x2apic'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='stibp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='succor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='lbrv'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='custom' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='ClearwaterForest'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ddpd-u'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sha512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ddpd-u'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sha512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Dhyana-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Turin'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibpb-brtype'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbpb'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibpb-brtype'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbpb'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-128'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-256'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-128'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-256'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='KnightsMill'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4fmaps'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4vnniw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512er'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512pf'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='KnightsMill-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4fmaps'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4vnniw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512er'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512pf'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tbm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tbm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='athlon'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='athlon-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='core2duo'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='core2duo-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='coreduo'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='coreduo-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='n270'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='n270-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='phenom'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='phenom-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <memoryBacking supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <enum name='sourceType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>file</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>anonymous</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>memfd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </memoryBacking>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <devices>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <disk supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='diskDevice'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>disk</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>cdrom</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>floppy</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>lun</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='bus'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>fdc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>scsi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>sata</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-non-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </disk>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <graphics supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vnc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>egl-headless</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dbus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </graphics>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <video supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='modelType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vga</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>cirrus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>none</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>bochs</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ramfb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </video>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <hostdev supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='mode'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>subsystem</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='startupPolicy'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>default</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>mandatory</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>requisite</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>optional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='subsysType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pci</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>scsi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='capsType'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='pciBackend'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </hostdev>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <rng supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-non-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>random</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>egd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>builtin</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </rng>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <filesystem supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='driverType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>path</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>handle</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtiofs</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </filesystem>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <tpm supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tpm-tis</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tpm-crb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>emulator</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>external</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendVersion'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>2.0</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </tpm>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <redirdev supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='bus'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </redirdev>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <channel supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pty</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>unix</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </channel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <crypto supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>qemu</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>builtin</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </crypto>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <interface supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>default</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>passt</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </interface>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <panic supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>isa</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>hyperv</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </panic>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <console supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>null</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pty</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dev</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>file</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pipe</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>stdio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>udp</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tcp</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>unix</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>qemu-vdagent</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dbus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </console>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </devices>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <gic supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <vmcoreinfo supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <genid supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <backingStoreInput supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <backup supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <async-teardown supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <s390-pv supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <ps2 supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <tdx supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <sev supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <sgx supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <hyperv supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='features'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>relaxed</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vapic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>spinlocks</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vpindex</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>runtime</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>synic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>stimer</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>reset</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vendor_id</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>frequencies</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>reenlightenment</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tlbflush</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ipi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>avic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>emsr_bitmap</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>xmm_input</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <defaults>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <spinlocks>4095</spinlocks>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <stimer_direct>on</stimer_direct>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </defaults>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </hyperv>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <launchSecurity supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: </domainCapabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.394 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 30 04:17:18 np0005601978 nova_compute[182955]: <domainCapabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <path>/usr/libexec/qemu-kvm</path>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <domain>kvm</domain>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <arch>x86_64</arch>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <vcpu max='240'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <iothreads supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <os supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <enum name='firmware'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <loader supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>rom</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pflash</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='readonly'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>yes</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>no</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='secure'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>no</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </loader>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </os>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='host-passthrough' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='hostPassthroughMigratable'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>on</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>off</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='maximum' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='maximumMigratable'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>on</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>off</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='host-model' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <vendor>AMD</vendor>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='x2apic'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc-deadline'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='hypervisor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc_adjust'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='spec-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='stibp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='cmp_legacy'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='overflow-recov'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='succor'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='amd-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='virt-ssbd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='lbrv'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='tsc-scale'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='vmcb-clean'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='flushbyasid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='pause-filter'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='pfthreshold'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='svme-addr-chk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <feature policy='disable' name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <mode name='custom' supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Broadwell-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cascadelake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='ClearwaterForest'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ddpd-u'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sha512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='ClearwaterForest-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ddpd-u'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sha512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm3'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sm4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Cooperlake-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Denverton-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Dhyana-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Genoa-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Milan-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Rome-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Turin'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibpb-brtype'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbpb'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-Turin-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amd-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='auto-ibrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vp2intersect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fs-gs-base-ns'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibpb-brtype'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='no-nested-data-bp'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='null-sel-clr-base'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='perfmon-v2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbpb'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='srso-user-kernel-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='stibp-always-on'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='EPYC-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-128'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-256'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='GraniteRapids-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-128'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-256'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx10-512'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='prefetchiti'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Haswell-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-noTSX'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v6'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Icelake-Server-v7'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='IvyBridge-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='KnightsMill'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4fmaps'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4vnniw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512er'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512pf'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='KnightsMill-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4fmaps'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-4vnniw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512er'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512pf'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G4-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tbm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Opteron_G5-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fma4'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tbm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xop'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SapphireRapids-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='amx-tile'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-bf16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-fp16'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512-vpopcntdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bitalg'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vbmi2'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrc'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fzrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='la57'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='taa-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='tsx-ldtrk'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='SierraForest-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ifma'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-ne-convert'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx-vnni-int8'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bhi-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='bus-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cmpccxadd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fbsdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='fsrs'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ibrs-all'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='intel-psfd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ipred-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='lam'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mcdt-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pbrsb-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='psdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rrsba-ctrl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='sbdr-ssdp-no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='serialize'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vaes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='vpclmulqdq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Client-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='hle'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='rtm'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Skylake-Server-v5'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512bw'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512cd'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512dq'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512f'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='avx512vl'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='invpcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pcid'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='pku'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='mpx'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v2'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v3'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='core-capability'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='split-lock-detect'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='Snowridge-v4'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='cldemote'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='erms'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='gfni'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdir64b'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='movdiri'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='xsaves'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='athlon'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='athlon-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='core2duo'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='core2duo-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='coreduo'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='coreduo-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='n270'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='n270-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='ss'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='phenom'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <blockers model='phenom-v1'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnow'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <feature name='3dnowext'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </blockers>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </mode>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <memoryBacking supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <enum name='sourceType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>file</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>anonymous</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <value>memfd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </memoryBacking>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <devices>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <disk supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='diskDevice'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>disk</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>cdrom</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>floppy</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>lun</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='bus'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ide</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>fdc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>scsi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>sata</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-non-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </disk>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <graphics supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vnc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>egl-headless</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dbus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </graphics>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <video supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='modelType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vga</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>cirrus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>none</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>bochs</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ramfb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </video>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <hostdev supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='mode'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>subsystem</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='startupPolicy'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>default</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>mandatory</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>requisite</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>optional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='subsysType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pci</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>scsi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='capsType'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='pciBackend'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </hostdev>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <rng supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtio-non-transitional</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>random</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>egd</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>builtin</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </rng>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <filesystem supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='driverType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>path</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>handle</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>virtiofs</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </filesystem>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <tpm supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tpm-tis</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tpm-crb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>emulator</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>external</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendVersion'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>2.0</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </tpm>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <redirdev supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='bus'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>usb</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </redirdev>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <channel supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pty</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>unix</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </channel>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <crypto supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>qemu</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendModel'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>builtin</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </crypto>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <interface supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='backendType'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>default</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>passt</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </interface>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <panic supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='model'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>isa</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>hyperv</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </panic>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <console supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='type'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>null</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vc</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pty</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dev</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>file</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>pipe</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>stdio</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>udp</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tcp</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>unix</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>qemu-vdagent</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>dbus</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </console>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </devices>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <gic supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <vmcoreinfo supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <genid supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <backingStoreInput supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <backup supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <async-teardown supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <s390-pv supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <ps2 supported='yes'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <tdx supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <sev supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <sgx supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <hyperv supported='yes'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <enum name='features'>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>relaxed</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vapic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>spinlocks</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vpindex</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>runtime</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>synic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>stimer</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>reset</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>vendor_id</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>frequencies</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>reenlightenment</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>tlbflush</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>ipi</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>avic</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>emsr_bitmap</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <value>xmm_input</value>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </enum>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      <defaults>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <spinlocks>4095</spinlocks>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <stimer_direct>on</stimer_direct>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <tlbflush_direct>on</tlbflush_direct>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <tlbflush_extended>on</tlbflush_extended>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:      </defaults>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    </hyperv>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:    <launchSecurity supported='no'/>
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  </features>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: </domainCapabilities>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
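The <domainCapabilities> document ending above is what nova's _get_domain_capabilities pulls from libvirt. As a rough stand-alone illustration (not nova's code), the same XML can be fetched with the libvirt Python bindings; the connection URI and the emulator/machine selectors below are assumptions, not values taken from this log.

import libvirt  # python3-libvirt bindings

# Read-only connection to the local system libvirtd (URI is an assumption).
conn = libvirt.openReadOnly("qemu:///system")

# Ask libvirt what a guest with this emulator/arch/machine/virt type could use.
# The selector values are illustrative only.
caps_xml = conn.getDomainCapabilities(
    "/usr/libexec/qemu-kvm",  # emulator binary (assumed path)
    "x86_64",                 # guest architecture
    "q35",                    # machine type
    "kvm",                    # virtualization type
    0,
)
print(caps_xml)  # XML comparable to the <domainCapabilities> block logged above
conn.close()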
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.458 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.459 182959 INFO nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Secure Boot support detected#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.461 182959 INFO nova.virt.libvirt.driver [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.471 182959 DEBUG nova.virt.libvirt.driver [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] cpu compare xml: <cpu match="exact">
Jan 30 04:17:18 np0005601978 nova_compute[182955]:  <model>Nehalem</model>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: </cpu>
Jan 30 04:17:18 np0005601978 nova_compute[182955]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
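The cpu compare XML above (a bare <cpu> asking for an exact Nehalem match) is what nova hands to libvirt to verify the host CPU can provide that model. A minimal sketch of the underlying call via the libvirt Python bindings, assuming a local qemu:///system connection:

import libvirt

conn = libvirt.openReadOnly("qemu:///system")  # URI is an assumption

cpu_xml = '<cpu match="exact"><model>Nehalem</model></cpu>'
result = conn.compareCPU(cpu_xml, 0)

# compareCPU reports how the host CPU relates to the requested model.
if result == libvirt.VIR_CPU_COMPARE_INCOMPATIBLE:
    print("host CPU cannot provide the Nehalem model")
else:
    # VIR_CPU_COMPARE_IDENTICAL or VIR_CPU_COMPARE_SUPERSET
    print("host CPU is identical to or a superset of Nehalem")
conn.close()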
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.473 182959 DEBUG nova.virt.libvirt.driver [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.494 182959 INFO nova.virt.node [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Determined node identity 5912bad0-7860-4f37-8078-1db5720295f4 from /var/lib/nova/compute_id#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.514 182959 WARNING nova.compute.manager [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Compute nodes ['5912bad0-7860-4f37-8078-1db5720295f4'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.550 182959 INFO nova.compute.manager [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.588 182959 WARNING nova.compute.manager [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.588 182959 DEBUG oslo_concurrency.lockutils [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.588 182959 DEBUG oslo_concurrency.lockutils [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.588 182959 DEBUG oslo_concurrency.lockutils [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
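The three lockutils lines above show the "compute_resources" semaphore being taken and released around ResourceTracker.clean_compute_node_cache. A minimal sketch of the oslo.concurrency pattern that produces such acquire/release pairs (the decorated function below is invented for illustration, not nova's code):

from oslo_concurrency import lockutils

COMPUTE_RESOURCE_SEMAPHORE = "compute_resources"

@lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
def clean_compute_node_cache_example():
    # Runs with the "compute_resources" lock held; acquiring and releasing
    # it is what emits the DEBUG lines above.
    pass

clean_compute_node_cache_example()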
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.589 182959 DEBUG nova.compute.resource_tracker [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:17:18 np0005601978 systemd[1]: Starting libvirt nodedev daemon...
Jan 30 04:17:18 np0005601978 systemd[1]: Started libvirt nodedev daemon.
Jan 30 04:17:18 np0005601978 python3.9[185994]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.843 182959 WARNING nova.virt.libvirt.driver [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.845 182959 DEBUG nova.compute.resource_tracker [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6174MB free_disk=73.5787582397461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.845 182959 DEBUG oslo_concurrency.lockutils [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.845 182959 DEBUG oslo_concurrency.lockutils [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.856 182959 WARNING nova.compute.resource_tracker [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] No compute node record for compute-1.ctlplane.example.com:5912bad0-7860-4f37-8078-1db5720295f4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5912bad0-7860-4f37-8078-1db5720295f4 could not be found.#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.871 182959 INFO nova.compute.resource_tracker [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 5912bad0-7860-4f37-8078-1db5720295f4#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.912 182959 DEBUG nova.compute.resource_tracker [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:17:18 np0005601978 nova_compute[182955]: 2026-01-30 09:17:18.913 182959 DEBUG nova.compute.resource_tracker [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.056 182959 INFO nova.scheduler.client.report [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] [req-82bf99bf-6742-45f6-8511-b95cc2b0b20a] Created resource provider record via placement API for resource provider with UUID 5912bad0-7860-4f37-8078-1db5720295f4 and name compute-1.ctlplane.example.com.#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.110 182959 DEBUG nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 30 04:17:19 np0005601978 nova_compute[182955]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.111 182959 INFO nova.virt.libvirt.host [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] kernel doesn't support AMD SEV#033[00m
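The two lines above show nova reading /sys/module/kvm_amd/parameters/sev, finding "N", and concluding the kernel lacks AMD SEV support. A rough stand-alone check along the same lines (not nova's implementation):

from pathlib import Path

def kernel_supports_amd_sev(param=Path("/sys/module/kvm_amd/parameters/sev")):
    # Missing file or "N" means no kernel-side SEV; "Y" or "1" means enabled.
    try:
        return param.read_text().strip().lower() in ("y", "1")
    except OSError:
        return False

print(kernel_supports_amd_sev())  # False on the host logged above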
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.112 182959 DEBUG nova.compute.provider_tree [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
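The inventory dict above encodes effective capacity per resource class as (total - reserved) * allocation_ratio, which is what placement exposes to the scheduler. Recomputing those figures from the logged values as a quick sanity check (pure arithmetic, no placement API involved):

# Values copied from the inventory logged above.
inventory = {
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
    "DISK_GB":   {"total": 79,   "reserved": 0,   "allocation_ratio": 0.9},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g} schedulable units")

# Prints: MEMORY_MB: 7167, VCPU: 32, DISK_GB: 71.1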
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.112 182959 DEBUG nova.virt.libvirt.driver [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.116 182959 DEBUG nova.virt.libvirt.driver [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Libvirt baseline CPU <cpu>
Jan 30 04:17:19 np0005601978 nova_compute[182955]:  <arch>x86_64</arch>
Jan 30 04:17:19 np0005601978 nova_compute[182955]:  <model>Nehalem</model>
Jan 30 04:17:19 np0005601978 nova_compute[182955]:  <vendor>AMD</vendor>
Jan 30 04:17:19 np0005601978 nova_compute[182955]:  <topology sockets="8" cores="1" threads="1"/>
Jan 30 04:17:19 np0005601978 nova_compute[182955]: </cpu>
Jan 30 04:17:19 np0005601978 nova_compute[182955]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.171 182959 DEBUG nova.scheduler.client.report [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Updated inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.172 182959 DEBUG nova.compute.provider_tree [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Updating resource provider 5912bad0-7860-4f37-8078-1db5720295f4 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.172 182959 DEBUG nova.compute.provider_tree [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.258 182959 DEBUG nova.compute.provider_tree [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Updating resource provider 5912bad0-7860-4f37-8078-1db5720295f4 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.294 182959 DEBUG nova.compute.resource_tracker [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.294 182959 DEBUG oslo_concurrency.lockutils [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.295 182959 DEBUG nova.service [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.366 182959 DEBUG nova.service [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Jan 30 04:17:19 np0005601978 nova_compute[182955]: 2026-01-30 09:17:19.367 182959 DEBUG nova.servicegroup.drivers.db [None req-0a8b570a-660a-4069-93b9-765e3dc5b8ce - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Jan 30 04:17:20 np0005601978 python3.9[186175]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:21 np0005601978 python3.9[186296]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769764640.2033234-514-253340648464607/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:21 np0005601978 python3.9[186446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:22 np0005601978 python3.9[186567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769764641.3436677-514-134575463561680/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:22 np0005601978 python3.9[186717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:23 np0005601978 python3.9[186838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769764642.2803738-514-30976141968736/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:24 np0005601978 python3.9[186988]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:17:24 np0005601978 python3.9[187140]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:17:25 np0005601978 python3.9[187292]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:25 np0005601978 python3.9[187413]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764645.1263301-691-234287230861740/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:26 np0005601978 python3.9[187563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:26 np0005601978 python3.9[187684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764646.0653477-691-242157509105311/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:27 np0005601978 python3.9[187834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:28 np0005601978 python3.9[187955]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764647.327953-778-204919133937244/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:28 np0005601978 python3.9[188105]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:29 np0005601978 python3.9[188226]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764648.5034175-826-168399423268260/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:30 np0005601978 python3.9[188376]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:30 np0005601978 python3.9[188497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764649.7958508-871-253494907371272/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:31 np0005601978 python3.9[188647]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:31 np0005601978 python3.9[188768]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764651.0388687-916-867759694778/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:32 np0005601978 python3.9[188920]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:33 np0005601978 python3.9[189072]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:33 np0005601978 podman[189196]: 2026-01-30 09:17:33.829109176 +0000 UTC m=+0.053247640 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 30 04:17:33 np0005601978 python3.9[189238]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:17:34 np0005601978 python3.9[189394]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:17:35 np0005601978 python3.9[189546]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:17:36 np0005601978 python3.9[189700]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:37 np0005601978 python3.9[189852]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:17:37 np0005601978 systemd[1]: Reloading.
Jan 30 04:17:37 np0005601978 ovn_controller[95419]: 2026-01-30T09:17:37Z|00070|chassis|WARN|Dropped 6 log messages in last 30 seconds (most recently, 28 seconds ago) due to excessive rate
Jan 30 04:17:37 np0005601978 ovn_controller[95419]: 2026-01-30T09:17:37Z|00071|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:17:37 np0005601978 podman[189854]: 2026-01-30 09:17:37.405340052 +0000 UTC m=+0.090051007 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:17:37 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:17:37 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:17:37 np0005601978 systemd[1]: Listening on Podman API Socket.
Jan 30 04:17:38 np0005601978 python3.9[190070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:39 np0005601978 python3.9[190193]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764658.0217984-1132-763291395414/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:39 np0005601978 python3.9[190269]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:39 np0005601978 python3.9[190392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764658.0217984-1132-763291395414/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:41 np0005601978 python3.9[190544]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:42 np0005601978 python3.9[190696]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:42 np0005601978 python3.9[190848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:43 np0005601978 python3.9[190971]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764662.4441595-1276-118364262542204/.source.json _original_basename=.obdrt4u1 follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:44 np0005601978 python3.9[191121]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:46 np0005601978 python3.9[191544]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 30 04:17:47 np0005601978 python3.9[191696]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:17:48 np0005601978 python3[191848]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:17:49 np0005601978 podman[191883]: 2026-01-30 09:17:49.055408513 +0000 UTC m=+0.024833915 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 30 04:17:49 np0005601978 podman[191883]: 2026-01-30 09:17:49.266836807 +0000 UTC m=+0.236262219 container create 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:17:49 np0005601978 python3[191848]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 30 04:17:50 np0005601978 python3.9[192074]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:17:51 np0005601978 python3.9[192228]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:51 np0005601978 python3.9[192304]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:17:52 np0005601978 nova_compute[182955]: 2026-01-30 09:17:52.370 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
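The line above is one of nova-compute's oslo.service periodic tasks firing. A minimal sketch of how such a task is declared and driven with oslo.service (the class name and spacing are invented for illustration, not nova's actual values):

from oslo_config import cfg
from oslo_service import periodic_task

class ExampleManager(periodic_task.PeriodicTasks):
    def __init__(self, conf):
        super().__init__(conf)

    @periodic_task.periodic_task(spacing=600)
    def _sync_power_states(self, context):
        # Invoked by run_periodic_tasks() roughly every `spacing` seconds,
        # which is what produces the "Running periodic task ..." DEBUG line.
        pass

mgr = ExampleManager(cfg.CONF)
mgr.run_periodic_tasks(context=None)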
Jan 30 04:17:52 np0005601978 python3.9[192455]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764671.7063746-1510-67444367413020/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:53 np0005601978 python3.9[192531]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:17:53 np0005601978 systemd[1]: Reloading.
Jan 30 04:17:53 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:17:53 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:17:53 np0005601978 python3.9[192642]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:17:54 np0005601978 systemd[1]: Reloading.
Jan 30 04:17:54 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:17:54 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:17:54 np0005601978 systemd[1]: Starting ceilometer_agent_compute container...
Jan 30 04:17:54 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:17:54 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a09550228b5ccd018427b10f16f39c48527446e37c670613746823b7aa571825/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 30 04:17:54 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a09550228b5ccd018427b10f16f39c48527446e37c670613746823b7aa571825/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 30 04:17:54 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a09550228b5ccd018427b10f16f39c48527446e37c670613746823b7aa571825/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 30 04:17:54 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a09550228b5ccd018427b10f16f39c48527446e37c670613746823b7aa571825/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 30 04:17:54 np0005601978 systemd[1]: Started /usr/bin/podman healthcheck run 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6.
Jan 30 04:17:54 np0005601978 podman[192681]: 2026-01-30 09:17:54.524362271 +0000 UTC m=+0.175301417 container init 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + sudo -E kolla_set_configs
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: sudo: unable to send audit message: Operation not permitted
Jan 30 04:17:54 np0005601978 podman[192681]: 2026-01-30 09:17:54.570793427 +0000 UTC m=+0.221732573 container start 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 30 04:17:54 np0005601978 podman[192681]: ceilometer_agent_compute
Jan 30 04:17:54 np0005601978 systemd[1]: Started ceilometer_agent_compute container.
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Validating config file
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Copying service configuration files
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: INFO:__main__:Writing out command to execute
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: ++ cat /run_command
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + ARGS=
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + sudo kolla_copy_cacerts
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: sudo: unable to send audit message: Operation not permitted
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + [[ ! -n '' ]]
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + . kolla_extend_start
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + umask 0022
Jan 30 04:17:54 np0005601978 ceilometer_agent_compute[192697]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
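
[editor's note] The INFO:__main__ lines above come from the kolla config bootstrap: under COPY_ALWAYS it loads /var/lib/kolla/config_files/config.json, copies each listed file into place, fixes permissions, then writes the service command to /run_command (read back above by `cat /run_command`). Below is a minimal Python sketch of that copy loop, assuming a simplified config.json with "command" and "config_files" entries; the field names and permission handling are illustrative, not the actual kolla set_configs.py.

#!/usr/bin/env python3
# Minimal sketch of the COPY_ALWAYS config step traced above.
# Assumes a simplified config.json with "command" and "config_files"
# entries; illustrative only, not the real kolla set_configs.py.
import json
import os
import shutil

CONFIG_JSON = "/var/lib/kolla/config_files/config.json"

def copy_config_files():
    with open(CONFIG_JSON) as f:
        cfg = json.load(f)

    for entry in cfg.get("config_files", []):
        src, dest = entry["source"], entry["dest"]
        if os.path.exists(dest):
            print(f"Deleting {dest}")
            os.remove(dest)
        print(f"Copying {src} to {dest}")
        shutil.copy(src, dest)
        print(f"Setting permission for {dest}")
        # "perm" is assumed to be an octal string such as "0600".
        os.chmod(dest, int(entry.get("perm", "0600"), 8))

    # The entrypoint later reads this back with `cat /run_command`.
    with open("/run_command", "w") as f:
        f.write(cfg["command"])

if __name__ == "__main__":
    copy_config_files()
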
Jan 30 04:17:54 np0005601978 podman[192704]: 2026-01-30 09:17:54.676990936 +0000 UTC m=+0.095001508 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:17:54 np0005601978 systemd[1]: 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6-1e7ec61f5046b029.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:17:54 np0005601978 systemd[1]: 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6-1e7ec61f5046b029.service: Failed with result 'exit-code'.
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.530 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.530 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.530 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.530 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.530 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.531 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.532 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.533 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.534 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.535 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.536 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.537 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.538 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.539 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.540 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.541 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.542 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.543 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.544 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.544 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.544 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.544 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.544 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
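
[editor's note] The DEBUG block above is oslo.config's log_opt_values() walking every registered option at service start (hence the repeated cfg.py:2602/2609 references). A minimal sketch of that pattern follows, using a couple of hypothetical options rather than ceilometer's real option set, and the same config file and conf.d directory recorded in the dump:

# Minimal sketch of the oslo.config dump pattern seen above (log_opt_values),
# with hypothetical options; not ceilometer's actual option definitions.
import logging

from oslo_config import cfg

LOG = logging.getLogger(__name__)

OPTS = [
    cfg.IntOpt("batch_size", default=50, help="Hypothetical batch size option."),
    cfg.StrOpt("hypervisor_inspector", default="libvirt",
               help="Hypothetical inspector selector."),
]

CONF = cfg.ConfigOpts()
CONF.register_opts(OPTS)

if __name__ == "__main__":
    logging.basicConfig(level=logging.DEBUG)
    # Paths taken from the dump above; assumed to exist inside the container.
    CONF(args=[],
         project="ceilometer",
         default_config_files=["/etc/ceilometer/ceilometer.conf"],
         default_config_dirs=["/etc/ceilometer/ceilometer.conf.d"])
    # Emits the same asterisk-delimited "Full set of CONF" block as
    # cotyledon.oslo_config_glue does at startup.
    CONF.log_opt_values(LOG, logging.DEBUG)
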
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.566 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.568 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.569 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.652 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
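
[editor's note] The "Connecting to libvirt: qemu:///system" line reflects the poller reaching the host libvirt through the /run/libvirt bind mount from the container definition. A minimal read-only sketch with the libvirt-python bindings (assumed available) is shown below; it only lists domains, roughly the data an instance-discovery pass keys on.

# Minimal read-only sketch of "Connecting to libvirt: qemu:///system" above,
# assuming the libvirt-python bindings are installed; purely illustrative.
import libvirt

def list_domains(uri="qemu:///system"):
    conn = libvirt.openReadOnly(uri)
    try:
        for dom in conn.listAllDomains():
            # Name and UUID are what a discovery pass would key on.
            state = "running" if dom.isActive() else "shut off"
            print(dom.UUIDString(), dom.name(), state)
    finally:
        conn.close()

if __name__ == "__main__":
    list_domains()
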
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.717 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.718 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.719 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.720 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.721 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.722 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.723 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.724 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.725 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.726 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.727 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.728 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.729 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.730 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.731 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.732 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.733 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.734 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.735 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.736 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.736 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.736 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
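[editor's note] The option dump that ends above is oslo.config's standard behaviour: when debug logging is enabled, cotyledon's oslo_config_glue calls log_opt_values() on the parsed configuration and prints every effective option (secret values rendered as ****) before the service starts. A minimal sketch of that mechanism, assuming only oslo.config is installed; the option names below are illustrative, not ceilometer's real option set:

    # Minimal sketch of the option dump seen above (illustrative options only).
    import logging

    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    OPTS = [
        cfg.StrOpt('pipeline_cfg_file', default='pipeline.yaml'),
        cfg.IntOpt('max_parallel_requests', default=64),
        # secret=True is what makes a value render as **** in the dump
        cfg.StrOpt('transport_url', secret=True),
    ]

    CONF = cfg.ConfigOpts()
    CONF.register_opts(OPTS)
    CONF([], project='demo')

    # This call produces the "option = value ... log_opt_values" lines.
    CONF.log_opt_values(LOG, logging.DEBUG)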
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.736 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.740 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
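[editor's note] The dict logged above is the parsed polling definition: one source named "pollsters", polled every 120 seconds, covering the listed meters. A small sketch, assuming PyYAML, of a polling file that loads into exactly that structure (the inline string stands in for the real polling.yaml):

    # Sketch: a polling definition equivalent to the dict the agent logged above.
    import yaml

    POLLING_YAML = """
    sources:
      - name: pollsters
        interval: 120
        meters:
          - power.state
          - cpu
          - memory.usage
          - disk.*
          - network.*
    """

    config = yaml.safe_load(POLLING_YAML)
    assert config == {'sources': [{'name': 'pollsters', 'interval': 120,
                                   'meters': ['power.state', 'cpu', 'memory.usage',
                                              'disk.*', 'network.*']}]}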
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.750 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:17:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:17:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
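[editor's note] Every "Skip pollster ... no resources found this cycle" line above means instance discovery returned nothing: the agent connected to qemu:///system (09:17:55.750) and found no running domains, so each compute pollster has nothing to sample. A hedged sketch of that kind of check using the libvirt-python bindings; this is not ceilometer's actual discovery code:

    # If libvirt reports no domains, there is nothing for the pollsters to sample.
    import libvirt

    conn = libvirt.openReadOnly('qemu:///system')
    domains = conn.listAllDomains()
    if not domains:
        print('no resources found this cycle - pollsters will be skipped')
    else:
        for dom in domains:
            print('would poll instance', dom.UUIDString(), dom.name())
    conn.close()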
Jan 30 04:17:55 np0005601978 python3.9[192880]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:17:57 np0005601978 python3.9[193035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:57.322 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:17:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:57.324 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:17:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:17:57.325 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:17:57 np0005601978 python3.9[193160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764676.5435765-1645-184919992341285/.source.yaml _original_basename=.qeg1_wq4 follow=False checksum=8780e83b750d8b2ad67c8598db1dab4e122e7afe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:17:58 np0005601978 python3.9[193312]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:17:59 np0005601978 python3.9[193435]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764677.9054937-1690-195933776695094/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:17:59 np0005601978 nova_compute[182955]: 2026-01-30 09:17:59.383 182959 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: timed out (retrying in 1.0 seconds): socket.timeout: timed out
Jan 30 04:18:00 np0005601978 python3.9[193587]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:01 np0005601978 python3.9[193739]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:18:01Z|00072|chassis|WARN|Dropped 1 log messages in last 24 seconds (most recently, 24 seconds ago) due to excessive rate
Jan 30 04:18:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:18:01Z|00073|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:18:01 np0005601978 python3.9[193891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:02 np0005601978 python3.9[193969]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.zvof4g14 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:02 np0005601978 python3.9[194119]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:04 np0005601978 podman[194364]: 2026-01-30 09:18:04.054895209 +0000 UTC m=+0.076324960 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 30 04:18:05 np0005601978 python3.9[194560]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 30 04:18:05 np0005601978 nova_compute[182955]: 2026-01-30 09:18:05.396 182959 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: timed out (retrying in 3.0 seconds): socket.timeout: timed out#033[00m
Jan 30 04:18:06 np0005601978 python3.9[194712]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:18:07 np0005601978 python3[194864]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:18:07 np0005601978 podman[194898]: 2026-01-30 09:18:07.645386499 +0000 UTC m=+0.048383076 container create e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:18:07 np0005601978 podman[194898]: 2026-01-30 09:18:07.623539701 +0000 UTC m=+0.026536278 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 30 04:18:07 np0005601978 python3[194864]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 30 04:18:08 np0005601978 podman[194959]: 2026-01-30 09:18:08.46512588 +0000 UTC m=+0.113677507 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:18:08 np0005601978 python3.9[195113]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:09 np0005601978 python3.9[195267]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:10 np0005601978 python3.9[195343]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:11 np0005601978 python3.9[195494]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764690.4160783-2026-223954831781974/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:11 np0005601978 python3.9[195570]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:18:11 np0005601978 systemd[1]: Reloading.
Jan 30 04:18:11 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:11 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:12 np0005601978 python3.9[195681]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:18:12 np0005601978 systemd[1]: Reloading.
Jan 30 04:18:12 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:12 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:12 np0005601978 systemd[1]: Starting node_exporter container...
Jan 30 04:18:12 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:18:12 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d471d74994e6eb2030359204cc9a4cd8efde9a99be269ec4f38a56b976979eb4/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:12 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d471d74994e6eb2030359204cc9a4cd8efde9a99be269ec4f38a56b976979eb4/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:12 np0005601978 systemd[1]: Started /usr/bin/podman healthcheck run e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901.
Jan 30 04:18:12 np0005601978 podman[195721]: 2026-01-30 09:18:12.925719667 +0000 UTC m=+0.129713231 container init e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.943Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.943Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.943Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.944Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.944Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.945Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.945Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.945Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.945Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=arp
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=bcache
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=bonding
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=cpu
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=edac
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=filefd
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=netclass
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=netdev
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=netstat
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=nfs
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=nvme
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=softnet
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=systemd
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=xfs
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.946Z caller=node_exporter.go:117 level=info collector=zfs
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.947Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 30 04:18:12 np0005601978 node_exporter[195737]: ts=2026-01-30T09:18:12.948Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 30 04:18:12 np0005601978 podman[195721]: 2026-01-30 09:18:12.951546956 +0000 UTC m=+0.155540490 container start e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:18:12 np0005601978 podman[195721]: node_exporter
Jan 30 04:18:12 np0005601978 systemd[1]: Started node_exporter container.
Jan 30 04:18:13 np0005601978 podman[195746]: 2026-01-30 09:18:13.024300984 +0000 UTC m=+0.065691391 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:18:13 np0005601978 nova_compute[182955]: 2026-01-30 09:18:13.262 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 8.89 sec#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.269 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.269 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.269 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.295 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.295 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.295 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.295 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:18:14 np0005601978 python3.9[195919]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.428 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.430 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6034MB free_disk=73.57783889770508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.430 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.431 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.505 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.506 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.527 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.542 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.543 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.544 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.544 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.603 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.604 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.604 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.605 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.620 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.621 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.621 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.622 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.622 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:14 np0005601978 nova_compute[182955]: 2026-01-30 09:18:14.623 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:15 np0005601978 python3.9[196071]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:16 np0005601978 python3.9[196196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764695.4850278-2161-77450857538050/.source.yaml _original_basename=.071vf36q follow=False checksum=42128c20150d024023dad565fc076bdb6d93d087 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:17 np0005601978 python3.9[196348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:17 np0005601978 python3.9[196471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764696.7587233-2206-76904426537137/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:19 np0005601978 python3.9[196623]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:20 np0005601978 python3.9[196775]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:20 np0005601978 python3.9[196927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:21 np0005601978 python3.9[197005]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.jz0rook_ recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:22 np0005601978 python3.9[197155]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:24 np0005601978 python3.9[197578]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 30 04:18:25 np0005601978 podman[197702]: 2026-01-30 09:18:25.051629407 +0000 UTC m=+0.080495153 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:18:25 np0005601978 systemd[1]: 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6-1e7ec61f5046b029.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:18:25 np0005601978 systemd[1]: 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6-1e7ec61f5046b029.service: Failed with result 'exit-code'.
Jan 30 04:18:25 np0005601978 python3.9[197748]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:18:26 np0005601978 python3[197900]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:18:27 np0005601978 podman[197914]: 2026-01-30 09:18:27.362231134 +0000 UTC m=+1.142931574 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 30 04:18:27 np0005601978 podman[198007]: 2026-01-30 09:18:27.453533919 +0000 UTC m=+0.034384955 container create a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter)
Jan 30 04:18:27 np0005601978 podman[198007]: 2026-01-30 09:18:27.435310551 +0000 UTC m=+0.016161607 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 30 04:18:27 np0005601978 python3[197900]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 30 04:18:28 np0005601978 python3.9[198196]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:29 np0005601978 python3.9[198350]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:29 np0005601978 python3.9[198426]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:30 np0005601978 python3.9[198577]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764709.8252444-2542-74897519929693/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:30 np0005601978 python3.9[198653]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:18:30 np0005601978 systemd[1]: Reloading.
Jan 30 04:18:30 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:30 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:31 np0005601978 python3.9[198764]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:18:31 np0005601978 systemd[1]: Reloading.
Jan 30 04:18:31 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:31 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:32 np0005601978 systemd[1]: Starting podman_exporter container...
Jan 30 04:18:32 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:18:32 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b13b381d0735a0dd0406c526616d39c8be9ca6a331182856d2f10fe9c0f590c9/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:32 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b13b381d0735a0dd0406c526616d39c8be9ca6a331182856d2f10fe9c0f590c9/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:32 np0005601978 systemd[1]: Started /usr/bin/podman healthcheck run a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc.
Jan 30 04:18:32 np0005601978 podman[198804]: 2026-01-30 09:18:32.170023027 +0000 UTC m=+0.152723399 container init a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:18:32 np0005601978 podman_exporter[198819]: ts=2026-01-30T09:18:32.184Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 30 04:18:32 np0005601978 podman_exporter[198819]: ts=2026-01-30T09:18:32.184Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 30 04:18:32 np0005601978 podman_exporter[198819]: ts=2026-01-30T09:18:32.184Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 30 04:18:32 np0005601978 podman_exporter[198819]: ts=2026-01-30T09:18:32.184Z caller=handler.go:105 level=info collector=container
Jan 30 04:18:32 np0005601978 podman[198804]: 2026-01-30 09:18:32.195927848 +0000 UTC m=+0.178628180 container start a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:18:32 np0005601978 systemd[1]: Starting Podman API Service...
Jan 30 04:18:32 np0005601978 systemd[1]: Started Podman API Service.
Jan 30 04:18:32 np0005601978 podman[198804]: podman_exporter
Jan 30 04:18:32 np0005601978 systemd[1]: Started podman_exporter container.
Jan 30 04:18:32 np0005601978 podman[198830]: time="2026-01-30T09:18:32Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 30 04:18:32 np0005601978 podman[198830]: time="2026-01-30T09:18:32Z" level=info msg="Setting parallel job count to 25"
Jan 30 04:18:32 np0005601978 podman[198830]: time="2026-01-30T09:18:32Z" level=info msg="Using sqlite as database backend"
Jan 30 04:18:32 np0005601978 podman[198830]: time="2026-01-30T09:18:32Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 30 04:18:32 np0005601978 podman[198830]: time="2026-01-30T09:18:32Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 30 04:18:32 np0005601978 podman[198830]: time="2026-01-30T09:18:32Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 30 04:18:32 np0005601978 podman[198830]: @ - - [30/Jan/2026:09:18:32 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 30 04:18:32 np0005601978 podman[198830]: time="2026-01-30T09:18:32Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 30 04:18:32 np0005601978 podman[198829]: 2026-01-30 09:18:32.292367732 +0000 UTC m=+0.084640219 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:18:32 np0005601978 podman[198830]: @ - - [30/Jan/2026:09:18:32 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18074 "" "Go-http-client/1.1"
Jan 30 04:18:32 np0005601978 podman_exporter[198819]: ts=2026-01-30T09:18:32.294Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 30 04:18:32 np0005601978 podman_exporter[198819]: ts=2026-01-30T09:18:32.294Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 30 04:18:32 np0005601978 podman_exporter[198819]: ts=2026-01-30T09:18:32.295Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 30 04:18:32 np0005601978 systemd[1]: a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc-4c54bd199f8bb259.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:18:32 np0005601978 systemd[1]: a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc-4c54bd199f8bb259.service: Failed with result 'exit-code'.
Jan 30 04:18:33 np0005601978 python3.9[199013]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:18:34 np0005601978 podman[199038]: 2026-01-30 09:18:34.439043219 +0000 UTC m=+0.096817814 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 30 04:18:35 np0005601978 python3.9[199185]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:35 np0005601978 python3.9[199310]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764714.6935027-2677-169960745928571/.source.yaml _original_basename=.r5zzt21l follow=False checksum=441100862386c9bbe2f594fa4146745dc81605f2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:36 np0005601978 python3.9[199462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:37 np0005601978 python3.9[199585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769764716.1271574-2722-136580004284459/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:38 np0005601978 python3.9[199737]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:39 np0005601978 ovn_controller[95419]: 2026-01-30T09:18:39Z|00074|chassis|WARN|Dropped 6 log messages in last 31 seconds (most recently, 30 seconds ago) due to excessive rate
Jan 30 04:18:39 np0005601978 ovn_controller[95419]: 2026-01-30T09:18:39Z|00075|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:18:39 np0005601978 podman[199861]: 2026-01-30 09:18:39.037194292 +0000 UTC m=+0.081867759 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:18:39 np0005601978 python3.9[199901]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 30 04:18:39 np0005601978 python3.9[200066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:40 np0005601978 python3.9[200144]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.gtpp4yal recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:40 np0005601978 python3.9[200294]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:43 np0005601978 podman[200670]: 2026-01-30 09:18:43.383655056 +0000 UTC m=+0.046414959 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:18:43 np0005601978 python3.9[200741]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 30 04:18:44 np0005601978 python3.9[200894]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 30 04:18:45 np0005601978 python3[201046]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 30 04:18:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:46.834 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:18:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:46.834 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:18:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:46.836 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:18:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:46.836 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:18:47 np0005601978 podman[201059]: 2026-01-30 09:18:47.681089574 +0000 UTC m=+2.033187553 image pull 2679468753c61ac8a0e14904b347eedc3a9181a15e3bff0987683c22e1f9cae7 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 30 04:18:47 np0005601978 podman[201157]: 2026-01-30 09:18:47.774491516 +0000 UTC m=+0.040237976 container create 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, release=1769056855)
Jan 30 04:18:47 np0005601978 podman[201157]: 2026-01-30 09:18:47.750445709 +0000 UTC m=+0.016192159 image pull 2679468753c61ac8a0e14904b347eedc3a9181a15e3bff0987683c22e1f9cae7 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 30 04:18:47 np0005601978 python3[201046]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 30 04:18:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:47.842 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:18:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:47.842 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:18:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:48.844 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:18:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:48.844 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:18:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:48.844 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:18:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:48.845 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:18:49 np0005601978 python3.9[201346]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:50 np0005601978 python3.9[201500]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:50 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:50.848 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:18:50 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:50.849 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:18:50 np0005601978 python3.9[201576]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:18:51 np0005601978 python3.9[201727]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769764730.9317756-3058-121560838746078/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:52 np0005601978 python3.9[201803]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 30 04:18:52 np0005601978 systemd[1]: Reloading.
Jan 30 04:18:52 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:52 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:52 np0005601978 nova_compute[182955]: 2026-01-30 09:18:52.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:18:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:52.851 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:18:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:52.852 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:18:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:52.852 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:18:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:52.852 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:18:53 np0005601978 python3.9[201914]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 30 04:18:53 np0005601978 systemd[1]: Reloading.
Jan 30 04:18:53 np0005601978 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 30 04:18:53 np0005601978 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 30 04:18:53 np0005601978 systemd[1]: Starting openstack_network_exporter container...
Jan 30 04:18:53 np0005601978 systemd[1]: Started libcrun container.
Jan 30 04:18:53 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8203439279a69a6c77df882fde9f332d0f7719f67df1d3c175e93b17d0d07379/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:53 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8203439279a69a6c77df882fde9f332d0f7719f67df1d3c175e93b17d0d07379/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:53 np0005601978 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8203439279a69a6c77df882fde9f332d0f7719f67df1d3c175e93b17d0d07379/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 30 04:18:53 np0005601978 systemd[1]: Started /usr/bin/podman healthcheck run 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28.
Jan 30 04:18:53 np0005601978 podman[201954]: 2026-01-30 09:18:53.633135066 +0000 UTC m=+0.230969443 container init 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *bridge.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *coverage.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *datapath.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *iface.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *memory.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *ovn.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *pmd_perf.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *pmd_rxq.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: INFO    09:18:53 main.go:48: registering *vswitch.Collector
Jan 30 04:18:53 np0005601978 openstack_network_exporter[201970]: NOTICE  09:18:53 main.go:76: listening on https://:9105/metrics
Jan 30 04:18:53 np0005601978 podman[201954]: 2026-01-30 09:18:53.669212921 +0000 UTC m=+0.267047178 container start 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1769056855, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Jan 30 04:18:53 np0005601978 podman[201954]: openstack_network_exporter
Jan 30 04:18:53 np0005601978 systemd[1]: Started openstack_network_exporter container.
Jan 30 04:18:53 np0005601978 podman[201980]: 2026-01-30 09:18:53.783369232 +0000 UTC m=+0.104400364 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, architecture=x86_64, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:18:55 np0005601978 python3.9[202152]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 30 04:18:55 np0005601978 podman[202177]: 2026-01-30 09:18:55.407221961 +0000 UTC m=+0.063490071 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:18:55 np0005601978 systemd[1]: 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6-1e7ec61f5046b029.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:18:55 np0005601978 systemd[1]: 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6-1e7ec61f5046b029.service: Failed with result 'exit-code'.
Jan 30 04:18:56 np0005601978 auditd[701]: Audit daemon rotating log files
Jan 30 04:18:56 np0005601978 python3.9[202324]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:18:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:56.859 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:18:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:56.860 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:18:56 np0005601978 python3.9[202449]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764735.8301265-3193-18499593330434/.source.yaml _original_basename=.w5ksqvk2 follow=False checksum=c5fad1fa35e900faf5323c50ca144cf21063d1e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:18:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:57.324 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:18:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:57.324 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:18:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:18:57.325 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:18:57 np0005601978 python3.9[202601]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 30 04:19:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:00.860 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:19:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:00.861 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:19:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:00.866 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:19:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:00.866 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:19:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:19:01Z|00076|chassis|WARN|Dropped 2 log messages in last 22 seconds (most recently, 22 seconds ago) due to excessive rate
Jan 30 04:19:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:19:01Z|00077|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:19:02 np0005601978 podman[202626]: 2026-01-30 09:19:02.394307603 +0000 UTC m=+0.047384105 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:19:05 np0005601978 podman[202648]: 2026-01-30 09:19:05.438042625 +0000 UTC m=+0.094228215 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 30 04:19:08 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:08.879 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:19:08 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:08.883 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:19:09 np0005601978 podman[202667]: 2026-01-30 09:19:09.47911957 +0000 UTC m=+0.135158118 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.055 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.056 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.056 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.077 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.077 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.078 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.078 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.078 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.078 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.079 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.079 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.079 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.105 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.106 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.106 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.106 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.277 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.279 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5911MB free_disk=73.36659622192383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.279 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.280 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.339 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.340 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.364 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.380 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.382 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:19:10 np0005601978 nova_compute[182955]: 2026-01-30 09:19:10.383 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:19:11 np0005601978 nova_compute[182955]: 2026-01-30 09:19:11.076 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 17.81 sec#033[00m
Jan 30 04:19:11 np0005601978 nova_compute[182955]: 2026-01-30 09:19:11.379 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:19:14 np0005601978 podman[202693]: 2026-01-30 09:19:14.424950332 +0000 UTC m=+0.088986455 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:19:24 np0005601978 podman[202719]: 2026-01-30 09:19:24.398770021 +0000 UTC m=+0.056508477 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1769056855, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:19:26 np0005601978 podman[202741]: 2026-01-30 09:19:26.409054776 +0000 UTC m=+0.063786978 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:19:26 np0005601978 systemd[1]: 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6-1e7ec61f5046b029.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:19:26 np0005601978 systemd[1]: 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6-1e7ec61f5046b029.service: Failed with result 'exit-code'.
Jan 30 04:19:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:29.138 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:19:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:29.138 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:19:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:29.140 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:19:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:29.140 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:19:33 np0005601978 podman[202760]: 2026-01-30 09:19:33.406154222 +0000 UTC m=+0.071908264 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:19:36 np0005601978 podman[202784]: 2026-01-30 09:19:36.415031203 +0000 UTC m=+0.068680168 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 30 04:19:37 np0005601978 python3.9[202931]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 30 04:19:37 np0005601978 python3.9[203097]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:38 np0005601978 systemd[1]: Started libpod-conmon-4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089.scope.
Jan 30 04:19:38 np0005601978 podman[203098]: 2026-01-30 09:19:38.049034382 +0000 UTC m=+0.092855009 container exec 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 30 04:19:38 np0005601978 podman[203098]: 2026-01-30 09:19:38.084962852 +0000 UTC m=+0.128783469 container exec_died 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:19:38 np0005601978 systemd[1]: libpod-conmon-4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089.scope: Deactivated successfully.
Jan 30 04:19:38 np0005601978 python3.9[203280]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:38 np0005601978 systemd[1]: Started libpod-conmon-4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089.scope.
Jan 30 04:19:38 np0005601978 podman[203281]: 2026-01-30 09:19:38.939783413 +0000 UTC m=+0.109907250 container exec 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 30 04:19:38 np0005601978 podman[203281]: 2026-01-30 09:19:38.97105821 +0000 UTC m=+0.141182037 container exec_died 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:19:39 np0005601978 systemd[1]: libpod-conmon-4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089.scope: Deactivated successfully.
Jan 30 04:19:39 np0005601978 python3.9[203462]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:40 np0005601978 ovn_controller[95419]: 2026-01-30T09:19:40Z|00078|chassis|WARN|Dropped 4 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:19:40 np0005601978 ovn_controller[95419]: 2026-01-30T09:19:40Z|00079|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:19:40 np0005601978 podman[203586]: 2026-01-30 09:19:40.240828271 +0000 UTC m=+0.114228404 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 30 04:19:40 np0005601978 python3.9[203627]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 30 04:19:41 np0005601978 python3.9[203805]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:41 np0005601978 systemd[1]: Started libpod-conmon-6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959.scope.
Jan 30 04:19:41 np0005601978 podman[203806]: 2026-01-30 09:19:41.189963917 +0000 UTC m=+0.082320860 container exec 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 30 04:19:41 np0005601978 podman[203806]: 2026-01-30 09:19:41.221032908 +0000 UTC m=+0.113389891 container exec_died 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:19:41 np0005601978 systemd[1]: libpod-conmon-6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959.scope: Deactivated successfully.
Jan 30 04:19:42 np0005601978 python3.9[203989]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:42 np0005601978 systemd[1]: Started libpod-conmon-6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959.scope.
Jan 30 04:19:42 np0005601978 podman[203990]: 2026-01-30 09:19:42.139632666 +0000 UTC m=+0.089676284 container exec 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 30 04:19:42 np0005601978 podman[203990]: 2026-01-30 09:19:42.177717375 +0000 UTC m=+0.127760973 container exec_died 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:19:42 np0005601978 systemd[1]: libpod-conmon-6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959.scope: Deactivated successfully.
Jan 30 04:19:42 np0005601978 python3.9[204174]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:43 np0005601978 python3.9[204326]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 30 04:19:44 np0005601978 python3.9[204491]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:44 np0005601978 systemd[1]: Started libpod-conmon-33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6.scope.
Jan 30 04:19:44 np0005601978 podman[204492]: 2026-01-30 09:19:44.425421843 +0000 UTC m=+0.102539204 container exec 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:19:44 np0005601978 podman[204492]: 2026-01-30 09:19:44.454758089 +0000 UTC m=+0.131875470 container exec_died 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 30 04:19:44 np0005601978 systemd[1]: libpod-conmon-33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6.scope: Deactivated successfully.
Jan 30 04:19:44 np0005601978 podman[204523]: 2026-01-30 09:19:44.550827771 +0000 UTC m=+0.056888376 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:19:45 np0005601978 python3.9[204698]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:45 np0005601978 systemd[1]: Started libpod-conmon-33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6.scope.
Jan 30 04:19:45 np0005601978 podman[204699]: 2026-01-30 09:19:45.210075966 +0000 UTC m=+0.094485591 container exec 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:19:45 np0005601978 podman[204699]: 2026-01-30 09:19:45.24118456 +0000 UTC m=+0.125594135 container exec_died 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:19:45 np0005601978 systemd[1]: libpod-conmon-33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6.scope: Deactivated successfully.
Jan 30 04:19:45 np0005601978 python3.9[204882]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:46 np0005601978 python3.9[205034]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 30 04:19:47 np0005601978 python3.9[205199]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:47 np0005601978 systemd[1]: Started libpod-conmon-e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901.scope.
Jan 30 04:19:47 np0005601978 podman[205200]: 2026-01-30 09:19:47.54700096 +0000 UTC m=+0.093828190 container exec e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:19:47 np0005601978 podman[205200]: 2026-01-30 09:19:47.582326206 +0000 UTC m=+0.129153436 container exec_died e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:19:47 np0005601978 systemd[1]: libpod-conmon-e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901.scope: Deactivated successfully.
Jan 30 04:19:48 np0005601978 python3.9[205382]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:48 np0005601978 systemd[1]: Started libpod-conmon-e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901.scope.
Jan 30 04:19:48 np0005601978 podman[205383]: 2026-01-30 09:19:48.352744779 +0000 UTC m=+0.071944316 container exec e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:19:48 np0005601978 podman[205400]: 2026-01-30 09:19:48.407587119 +0000 UTC m=+0.047738371 container exec_died e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:19:48 np0005601978 podman[205383]: 2026-01-30 09:19:48.41236557 +0000 UTC m=+0.131565057 container exec_died e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:19:48 np0005601978 systemd[1]: libpod-conmon-e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901.scope: Deactivated successfully.
Jan 30 04:19:49 np0005601978 python3.9[205566]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:49 np0005601978 python3.9[205718]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 30 04:19:50 np0005601978 python3.9[205883]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:50 np0005601978 systemd[1]: Started libpod-conmon-a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc.scope.
Jan 30 04:19:50 np0005601978 podman[205884]: 2026-01-30 09:19:50.477807228 +0000 UTC m=+0.072961092 container exec a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:19:50 np0005601978 podman[205884]: 2026-01-30 09:19:50.51221905 +0000 UTC m=+0.107372904 container exec_died a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:19:50 np0005601978 systemd[1]: libpod-conmon-a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc.scope: Deactivated successfully.
Jan 30 04:19:51 np0005601978 python3.9[206067]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:51 np0005601978 systemd[1]: Started libpod-conmon-a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc.scope.
Jan 30 04:19:51 np0005601978 podman[206068]: 2026-01-30 09:19:51.346879131 +0000 UTC m=+0.089932271 container exec a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:19:51 np0005601978 podman[206068]: 2026-01-30 09:19:51.38077802 +0000 UTC m=+0.123831080 container exec_died a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:19:51 np0005601978 systemd[1]: libpod-conmon-a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc.scope: Deactivated successfully.
Jan 30 04:19:52 np0005601978 python3.9[206250]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:52 np0005601978 python3.9[206402]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 30 04:19:53 np0005601978 python3.9[206568]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:53 np0005601978 systemd[1]: Started libpod-conmon-95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28.scope.
Jan 30 04:19:53 np0005601978 podman[206569]: 2026-01-30 09:19:53.522504792 +0000 UTC m=+0.088611188 container exec 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:19:53 np0005601978 podman[206569]: 2026-01-30 09:19:53.55794385 +0000 UTC m=+0.124050196 container exec_died 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z)
Jan 30 04:19:53 np0005601978 systemd[1]: libpod-conmon-95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28.scope: Deactivated successfully.
Jan 30 04:19:54 np0005601978 python3.9[206754]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 30 04:19:54 np0005601978 systemd[1]: Started libpod-conmon-95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28.scope.
Jan 30 04:19:54 np0005601978 podman[206755]: 2026-01-30 09:19:54.30653826 +0000 UTC m=+0.072819537 container exec 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1769056855, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-type=git, version=9.7, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:19:54 np0005601978 podman[206755]: 2026-01-30 09:19:54.339113366 +0000 UTC m=+0.105394583 container exec_died 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 30 04:19:54 np0005601978 systemd[1]: libpod-conmon-95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28.scope: Deactivated successfully.
Jan 30 04:19:54 np0005601978 nova_compute[182955]: 2026-01-30 09:19:54.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:19:54 np0005601978 nova_compute[182955]: 2026-01-30 09:19:54.436 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:19:54 np0005601978 nova_compute[182955]: 2026-01-30 09:19:54.436 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 30 04:19:54 np0005601978 podman[206909]: 2026-01-30 09:19:54.910838602 +0000 UTC m=+0.095692848 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:19:55 np0005601978 python3.9[206954]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:19:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:19:55 np0005601978 python3.9[207111]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:56 np0005601978 python3.9[207263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:19:56 np0005601978 podman[207358]: 2026-01-30 09:19:56.992307144 +0000 UTC m=+0.084452162 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 30 04:19:57 np0005601978 python3.9[207402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769764796.108817-3886-4408528368827/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:57.325 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:19:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:57.326 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:19:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:19:57.326 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:19:57 np0005601978 python3.9[207555]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:58 np0005601978 python3.9[207707]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:19:59 np0005601978 python3.9[207785]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:19:59 np0005601978 python3.9[207937]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:00.302 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:20:00 np0005601978 python3.9[208015]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.3_xnu2t5 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:00.302 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:20:01 np0005601978 python3.9[208167]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:20:01Z|00080|chassis|WARN|Dropped 1 log messages in last 21 seconds (most recently, 21 seconds ago) due to excessive rate
Jan 30 04:20:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:20:01Z|00081|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:20:01 np0005601978 python3.9[208245]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:02 np0005601978 python3.9[208397]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:20:03 np0005601978 python3[208550]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 30 04:20:03 np0005601978 podman[208674]: 2026-01-30 09:20:03.670866342 +0000 UTC m=+0.092496816 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:20:03 np0005601978 python3.9[208712]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:04 np0005601978 python3.9[208801]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:04 np0005601978 python3.9[208953]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:05 np0005601978 python3.9[209031]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:05 np0005601978 python3.9[209183]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:06 np0005601978 python3.9[209261]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:06 np0005601978 podman[209385]: 2026-01-30 09:20:06.917097107 +0000 UTC m=+0.055651413 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:20:07 np0005601978 python3.9[209433]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:07 np0005601978 python3.9[209512]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:08 np0005601978 python3.9[209664]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 30 04:20:08 np0005601978 python3.9[209789]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769764807.8603046-4261-160326932679530/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:09 np0005601978 python3.9[209941]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:10 np0005601978 python3.9[210093]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:20:10 np0005601978 podman[210121]: 2026-01-30 09:20:10.420123252 +0000 UTC m=+0.074554302 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:20:11 np0005601978 python3.9[210274]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:12 np0005601978 python3.9[210426]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:20:12 np0005601978 python3.9[210579]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 30 04:20:13 np0005601978 python3.9[210733]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 30 04:20:14 np0005601978 python3.9[210888]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 30 04:20:14 np0005601978 systemd[1]: session-27.scope: Deactivated successfully.
Jan 30 04:20:14 np0005601978 systemd[1]: session-27.scope: Consumed 1min 39.161s CPU time.
Jan 30 04:20:14 np0005601978 systemd-logind[793]: Session 27 logged out. Waiting for processes to exit.
Jan 30 04:20:14 np0005601978 systemd-logind[793]: Removed session 27.
Jan 30 04:20:14 np0005601978 podman[210913]: 2026-01-30 09:20:14.805661272 +0000 UTC m=+0.091485680 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.030 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:20:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:17.031 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.782 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.782 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.783 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.783 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.783 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.783 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.783 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.784 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.784 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.807 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.808 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.809 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.809 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.946 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.947 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5931MB free_disk=73.39761734008789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.948 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:20:23 np0005601978 nova_compute[182955]: 2026-01-30 09:20:23.948 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:20:24 np0005601978 nova_compute[182955]: 2026-01-30 09:20:24.001 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:20:24 np0005601978 nova_compute[182955]: 2026-01-30 09:20:24.001 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:20:24 np0005601978 nova_compute[182955]: 2026-01-30 09:20:24.025 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:20:24 np0005601978 nova_compute[182955]: 2026-01-30 09:20:24.041 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:20:24 np0005601978 nova_compute[182955]: 2026-01-30 09:20:24.044 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:20:24 np0005601978 nova_compute[182955]: 2026-01-30 09:20:24.045 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:20:25 np0005601978 nova_compute[182955]: 2026-01-30 09:20:25.040 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:25 np0005601978 podman[210939]: 2026-01-30 09:20:25.431109149 +0000 UTC m=+0.083285123 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, name=ubi9/ubi-minimal)
Jan 30 04:20:25 np0005601978 nova_compute[182955]: 2026-01-30 09:20:25.828 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 44.75 sec#033[00m
Jan 30 04:20:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:25.907 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:20:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:25.908 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:20:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:25.909 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:20:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:25.910 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:20:27 np0005601978 podman[210960]: 2026-01-30 09:20:27.438169255 +0000 UTC m=+0.094368553 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Jan 30 04:20:34 np0005601978 podman[210981]: 2026-01-30 09:20:34.387299132 +0000 UTC m=+0.051461275 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:20:37 np0005601978 podman[211005]: 2026-01-30 09:20:37.405301291 +0000 UTC m=+0.060417153 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:20:41 np0005601978 ovn_controller[95419]: 2026-01-30T09:20:41Z|00082|chassis|WARN|Dropped 4 log messages in last 32 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:20:41 np0005601978 ovn_controller[95419]: 2026-01-30T09:20:41Z|00083|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:20:41 np0005601978 podman[211025]: 2026-01-30 09:20:41.452205464 +0000 UTC m=+0.101076943 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 30 04:20:45 np0005601978 podman[211052]: 2026-01-30 09:20:45.413494978 +0000 UTC m=+0.070792475 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:20:52 np0005601978 nova_compute[182955]: 2026-01-30 09:20:52.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:54 np0005601978 nova_compute[182955]: 2026-01-30 09:20:54.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:54 np0005601978 nova_compute[182955]: 2026-01-30 09:20:54.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.451 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.452 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.453 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.453 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.475 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.476 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.477 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.477 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.623 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.623 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6009MB free_disk=73.39761734008789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.624 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.624 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.667 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.667 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.683 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.694 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.695 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:20:55 np0005601978 nova_compute[182955]: 2026-01-30 09:20:55.695 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:20:56 np0005601978 podman[211076]: 2026-01-30 09:20:56.457337613 +0000 UTC m=+0.114288501 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, container_name=openstack_network_exporter, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal)
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.017 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.025 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.026 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.027 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.028 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.067 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.327 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.327 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:20:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:20:57.327 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:20:57 np0005601978 nova_compute[182955]: 2026-01-30 09:20:57.675 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:57 np0005601978 nova_compute[182955]: 2026-01-30 09:20:57.676 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:58 np0005601978 podman[211098]: 2026-01-30 09:20:58.413003739 +0000 UTC m=+0.067318882 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 30 04:20:58 np0005601978 nova_compute[182955]: 2026-01-30 09:20:58.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:20:58 np0005601978 nova_compute[182955]: 2026-01-30 09:20:58.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:21:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:21:01Z|00084|chassis|WARN|Dropped 1 log messages in last 20 seconds (most recently, 20 seconds ago) due to excessive rate
Jan 30 04:21:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:21:01Z|00085|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:21:05 np0005601978 podman[211118]: 2026-01-30 09:21:05.394664898 +0000 UTC m=+0.048674014 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:21:08 np0005601978 podman[211142]: 2026-01-30 09:21:08.408539539 +0000 UTC m=+0.066389948 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 30 04:21:12 np0005601978 podman[211162]: 2026-01-30 09:21:12.420229572 +0000 UTC m=+0.073970194 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller)
Jan 30 04:21:16 np0005601978 podman[211190]: 2026-01-30 09:21:16.413576716 +0000 UTC m=+0.071780214 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:21:27 np0005601978 podman[211214]: 2026-01-30 09:21:27.416829508 +0000 UTC m=+0.077335917 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, architecture=x86_64, version=9.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, io.openshift.tags=minimal rhel9)
Jan 30 04:21:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:27.795 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:21:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:27.795 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:21:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:27.797 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:21:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:27.797 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:21:29 np0005601978 podman[211236]: 2026-01-30 09:21:29.433168214 +0000 UTC m=+0.078849907 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 30 04:21:36 np0005601978 podman[211257]: 2026-01-30 09:21:36.414394452 +0000 UTC m=+0.065782211 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:21:37 np0005601978 nova_compute[182955]: 2026-01-30 09:21:37.258 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [SYS] unknown error (_ssl.c:2501)#033[00m
Jan 30 04:21:39 np0005601978 podman[211281]: 2026-01-30 09:21:39.419306658 +0000 UTC m=+0.068587757 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:21:42 np0005601978 nova_compute[182955]: 2026-01-30 09:21:42.267 182959 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: timed out (retrying in 1.0 seconds): socket.timeout: timed out#033[00m
Jan 30 04:21:43 np0005601978 ovn_controller[95419]: 2026-01-30T09:21:43Z|00086|chassis|WARN|Dropped 10 log messages in last 36 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:21:43 np0005601978 ovn_controller[95419]: 2026-01-30T09:21:43Z|00087|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:21:43 np0005601978 podman[211301]: 2026-01-30 09:21:43.448308763 +0000 UTC m=+0.107456525 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Jan 30 04:21:47 np0005601978 podman[211328]: 2026-01-30 09:21:47.413391306 +0000 UTC m=+0.072721900 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:21:52 np0005601978 nova_compute[182955]: 2026-01-30 09:21:52.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:52 np0005601978 nova_compute[182955]: 2026-01-30 09:21:52.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:21:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:21:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:57.328 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:21:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:57.328 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:21:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:57.329 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:21:58 np0005601978 podman[211353]: 2026-01-30 09:21:58.430233091 +0000 UTC m=+0.081488925 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, release=1769056855, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:21:58 np0005601978 nova_compute[182955]: 2026-01-30 09:21:58.502 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:21:58 np0005601978 nova_compute[182955]: 2026-01-30 09:21:58.502 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:58 np0005601978 nova_compute[182955]: 2026-01-30 09:21:58.503 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:21:58 np0005601978 nova_compute[182955]: 2026-01-30 09:21:58.515 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:21:58 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:58.916 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:21:58 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:21:58.952 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:22:00 np0005601978 podman[211375]: 2026-01-30 09:22:00.429976318 +0000 UTC m=+0.086731823 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.060 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 45.23 sec#033[00m
Jan 30 04:22:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:22:01Z|00088|chassis|WARN|Dropped 1 log messages in last 18 seconds (most recently, 18 seconds ago) due to excessive rate
Jan 30 04:22:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:22:01Z|00089|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.524 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.525 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.525 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.525 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.545 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.546 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.547 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.547 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.547 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.548 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.548 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.548 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.549 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.577 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.578 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.578 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.579 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.767 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.768 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6028MB free_disk=73.39761734008789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.768 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.768 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.871 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.872 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.900 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.915 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.917 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:22:01 np0005601978 nova_compute[182955]: 2026-01-30 09:22:01.918 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:22:07 np0005601978 podman[211395]: 2026-01-30 09:22:07.397454257 +0000 UTC m=+0.058500065 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:22:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:09.523 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:22:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:09.523 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:22:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:09.525 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:22:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:09.525 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:22:10 np0005601978 podman[211421]: 2026-01-30 09:22:10.419003486 +0000 UTC m=+0.073770086 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:22:14 np0005601978 ovn_controller[95419]: 2026-01-30T09:22:14Z|00090|chassis|WARN|Dropped 2 log messages in last 5 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 30 04:22:14 np0005601978 ovn_controller[95419]: 2026-01-30T09:22:14Z|00091|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:22:14 np0005601978 podman[211440]: 2026-01-30 09:22:14.444572866 +0000 UTC m=+0.098953239 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:22:18 np0005601978 podman[211467]: 2026-01-30 09:22:18.4020851 +0000 UTC m=+0.065820184 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:22:29 np0005601978 podman[211493]: 2026-01-30 09:22:29.413087136 +0000 UTC m=+0.071425840 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:22:31 np0005601978 podman[211515]: 2026-01-30 09:22:31.385094209 +0000 UTC m=+0.047905349 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 30 04:22:38 np0005601978 podman[211535]: 2026-01-30 09:22:38.419276483 +0000 UTC m=+0.078689228 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:22:41 np0005601978 podman[211560]: 2026-01-30 09:22:41.402658183 +0000 UTC m=+0.062775129 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 30 04:22:45 np0005601978 ovn_controller[95419]: 2026-01-30T09:22:45Z|00092|chassis|WARN|Dropped 2 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:22:45 np0005601978 ovn_controller[95419]: 2026-01-30T09:22:45Z|00093|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:22:45 np0005601978 podman[211581]: 2026-01-30 09:22:45.412437519 +0000 UTC m=+0.074254509 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:22:49 np0005601978 podman[211607]: 2026-01-30 09:22:49.388624946 +0000 UTC m=+0.049552437 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:22:53 np0005601978 nova_compute[182955]: 2026-01-30 09:22:53.822 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:22:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:56.682 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:22:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:56.683 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:22:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:57.329 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:22:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:57.330 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:22:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:22:57.330 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:00 np0005601978 podman[211632]: 2026-01-30 09:23:00.411445632 +0000 UTC m=+0.067893345 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:23:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:23:01Z|00094|chassis|WARN|Dropped 2 log messages in last 16 seconds (most recently, 16 seconds ago) due to excessive rate
Jan 30 04:23:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:23:01Z|00095|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:23:02 np0005601978 podman[211653]: 2026-01-30 09:23:02.396105487 +0000 UTC m=+0.053068302 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 30 04:23:09 np0005601978 podman[211673]: 2026-01-30 09:23:09.399160789 +0000 UTC m=+0.059889708 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:23:12 np0005601978 nova_compute[182955]: 2026-01-30 09:23:12.236 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "099fe98a-cfb6-42a3-a483-1e6d9b08c58f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:12 np0005601978 nova_compute[182955]: 2026-01-30 09:23:12.236 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "099fe98a-cfb6-42a3-a483-1e6d9b08c58f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:12 np0005601978 nova_compute[182955]: 2026-01-30 09:23:12.248 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "4d147881-3a87-455b-8fa9-c0e3091974fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:12 np0005601978 nova_compute[182955]: 2026-01-30 09:23:12.249 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "4d147881-3a87-455b-8fa9-c0e3091974fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:12 np0005601978 nova_compute[182955]: 2026-01-30 09:23:12.250 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "88b81871-e30c-47d0-972b-3c5ec68db2ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:12 np0005601978 nova_compute[182955]: 2026-01-30 09:23:12.250 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "88b81871-e30c-47d0-972b-3c5ec68db2ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:12 np0005601978 nova_compute[182955]: 2026-01-30 09:23:12.252 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "bab621cc-63ad-44f1-b991-ea0709fbf1ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:12 np0005601978 nova_compute[182955]: 2026-01-30 09:23:12.253 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "bab621cc-63ad-44f1-b991-ea0709fbf1ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:12 np0005601978 podman[211698]: 2026-01-30 09:23:12.415108833 +0000 UTC m=+0.071457697 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.766 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.768 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.770 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.773 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.786 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.786 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.786 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.927 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.929 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.930 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.931 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.931 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.932 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.933 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.934 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.935 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.963 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.963 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.964 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:13 np0005601978 nova_compute[182955]: 2026-01-30 09:23:13.964 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.015 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.016 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.017 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.024 182959 DEBUG nova.virt.hardware [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.024 182959 INFO nova.compute.claims [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.026 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.041 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.101 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.102 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6058MB free_disk=73.39761734008789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.102 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.110 182959 DEBUG nova.scheduler.client.report [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Refreshing inventories for resource provider 5912bad0-7860-4f37-8078-1db5720295f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.126 182959 DEBUG nova.scheduler.client.report [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Updating ProviderTree inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.127 182959 DEBUG nova.compute.provider_tree [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.141 182959 DEBUG nova.scheduler.client.report [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Refreshing aggregate associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.171 182959 DEBUG nova.scheduler.client.report [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Refreshing trait associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.255 182959 DEBUG nova.compute.provider_tree [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.271 182959 DEBUG nova.scheduler.client.report [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.293 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.295 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.299 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.306 182959 DEBUG nova.virt.hardware [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.307 182959 INFO nova.compute.claims [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.350 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.351 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.374 182959 INFO nova.virt.libvirt.driver [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.407 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.527 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.530 182959 DEBUG nova.virt.libvirt.driver [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.531 182959 INFO nova.virt.libvirt.driver [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Creating image(s)#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.532 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "/var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.532 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "/var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.533 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "/var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.534 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.535 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.544 182959 DEBUG nova.compute.provider_tree [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.570 182959 DEBUG nova.scheduler.client.report [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
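The inventory record above fixes the capacity that placement schedules against on this node. A minimal sketch of that arithmetic, assuming the standard placement formula capacity = (total - reserved) * allocation_ratio (the formula itself is not printed in the log):

    # Sketch only: capacity implied by the inventory dict logged above,
    # assuming capacity = (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 79,   'reserved': 0,   'allocation_ratio': 0.9},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: schedulable ~{capacity:g}")
    # VCPU: 32, MEMORY_MB: 7167, DISK_GB: 71.1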
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.619 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.620 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.624 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.632 182959 DEBUG nova.virt.hardware [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.632 182959 INFO nova.compute.claims [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.699 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.700 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.721 182959 INFO nova.virt.libvirt.driver [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.746 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.854 182959 DEBUG nova.compute.provider_tree [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.868 182959 DEBUG nova.scheduler.client.report [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.875 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.876 182959 DEBUG nova.virt.libvirt.driver [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.877 182959 INFO nova.virt.libvirt.driver [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Creating image(s)#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.877 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "/var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.878 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "/var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.878 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "/var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.879 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.888 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.889 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.891 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.897 182959 DEBUG nova.virt.hardware [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.897 182959 INFO nova.compute.claims [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.946 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:23:14 np0005601978 nova_compute[182955]: 2026-01-30 09:23:14.947 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.062 182959 INFO nova.virt.libvirt.driver [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.079 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.097 182959 WARNING oslo_policy.policy [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.097 182959 WARNING oslo_policy.policy [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.101 182959 DEBUG nova.policy [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c71dc7e6c4c24097bec57442c1a2bfa4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '942f48af96234078956f0ff31f10cb75', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.106 182959 DEBUG nova.policy [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9102293eb6874c798881ad2a64e09228', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '292129f9fc7a469199b7343ecb8146e6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
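The two DEBUG lines above are nova's soft policy check for network:attach_external_network; with only the member and reader roles it fails, and nova simply treats external networks as non-attachable for these requests. A minimal sketch of an equivalent oslo.policy check, assuming the default admin-only rule for this action (the rule text is an assumption, it is not shown in the log):

    # Hypothetical reproduction of the failed check, assuming the default
    # rule 'rule:context_is_admin' for network:attach_external_network.
    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.CONF)
    enforcer.register_default(policy.RuleDefault('context_is_admin', 'role:admin'))
    enforcer.register_default(policy.RuleDefault(
        'network:attach_external_network', 'rule:context_is_admin'))

    creds = {'roles': ['member', 'reader'],
             'project_id': '942f48af96234078956f0ff31f10cb75'}
    print(enforcer.enforce('network:attach_external_network', {}, creds))  # False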
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.181 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.184 182959 DEBUG nova.virt.libvirt.driver [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.184 182959 INFO nova.virt.libvirt.driver [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Creating image(s)#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.186 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "/var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.186 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.187 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "/var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.187 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.193 182959 DEBUG nova.compute.provider_tree [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.195 182959 DEBUG nova.policy [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f469d29ddd6455299c7fb0220c1ffcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.209 182959 DEBUG nova.scheduler.client.report [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.234 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.235 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.240 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 1.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.425 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.426 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:23:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:15.442 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:23:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:15.442 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:23:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:15.444 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:23:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:15.444 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.450 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Instance 099fe98a-cfb6-42a3-a483-1e6d9b08c58f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.451 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Instance 4d147881-3a87-455b-8fa9-c0e3091974fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.451 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Instance 88b81871-e30c-47d0-972b-3c5ec68db2ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.452 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Instance bab621cc-63ad-44f1-b991-ea0709fbf1ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.452 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.452 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
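The final resource view reconciles with the four per-instance allocations logged just above (each DISK_GB: 1, MEMORY_MB: 128, VCPU: 1), assuming used_ram also counts the 512 MB reserved for the host that appears in the inventory data:

    # Sketch: reconcile used_ram/used_disk/used_vcpus with the four
    # allocations above; counting reserved host RAM in used_ram is an
    # assumption consistent with the 512 MB reserve in the inventory.
    instances = 4
    per_instance = {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}
    reserved_host_ram_mb = 512

    used_ram_mb = reserved_host_ram_mb + instances * per_instance['MEMORY_MB']  # 1024
    used_disk_gb = instances * per_instance['DISK_GB']                          # 4
    used_vcpus = instances * per_instance['VCPU']                               # 4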
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.460 182959 INFO nova.virt.libvirt.driver [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.483 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.581 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.603 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.614 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.615 182959 DEBUG nova.virt.libvirt.driver [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.615 182959 INFO nova.virt.libvirt.driver [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Creating image(s)#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.616 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "/var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.616 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.617 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "/var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.617 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.649 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.650 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:15 np0005601978 nova_compute[182955]: 2026-01-30 09:23:15.983 182959 DEBUG nova.policy [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6dc933f15eb45099f724c1eb5d822e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33bba0bc2a744596b558c6598a1970de', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.227 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Successfully created port: bc0d9026-8b08-4f36-893e-d2e2df84b25b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.294 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.353 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.354 182959 DEBUG nova.virt.images [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] ab7cf61b-98df-4a10-83fd-7d23191f2bba was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.355 182959 DEBUG nova.privsep.utils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.356 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.part /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:16 np0005601978 ovn_controller[95419]: 2026-01-30T09:23:16Z|00096|chassis|WARN|Dropped 15 log messages in last 15 seconds (most recently, 11 seconds ago) due to excessive rate
Jan 30 04:23:16 np0005601978 ovn_controller[95419]: 2026-01-30T09:23:16Z|00097|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:23:16 np0005601978 podman[211718]: 2026-01-30 09:23:16.448397926 +0000 UTC m=+0.107317972 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 30 04:23:16 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:16.452 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:23:16 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:16.453 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.613 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Successfully created port: 47712b61-9e19-426f-a238-164499a5e96e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.702 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.part /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.converted" returned: 0 in 0.346s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.706 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.775 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4.converted --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.777 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
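The processutils lines above show the base-image fetch path for request 73c05655: inspect the downloaded .part file, convert qcow2 to raw into a .converted file, inspect the result, then release the per-image lock after 2.242 s. A minimal sketch of the same two commands via oslo.concurrency, assuming the paths exactly as logged (the rename of .converted into the final cached base file is implied by the later lines, not shown here):

    from oslo_concurrency import processutils

    base = '/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4'
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)

    # qemu-img info on the freshly downloaded image, under the same
    # address-space/CPU limits the log shows (--as=1073741824 --cpu=30).
    out, _err = processutils.execute('qemu-img', 'info', base + '.part',
                                     '--force-share', '--output=json',
                                     prlimit=limits)

    # Convert qcow2 -> raw, matching the logged command.
    processutils.execute('qemu-img', 'convert', '-t', 'none', '-O', 'raw',
                         '-f', 'qcow2', base + '.part', base + '.converted')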
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.803 182959 INFO oslo.privsep.daemon [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpn63im_95/privsep.sock']#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.805 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 1.926s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.806 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.825 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 1.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.825 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.847 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 1.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:16 np0005601978 nova_compute[182955]: 2026-01-30 09:23:16.847 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
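The lock cycles above show how concurrent builds of the same image are serialised: the first request holds the 27f3756d... lock for the whole download/convert (2.242 s), and the three requests queued behind it acquire and release it within about a millisecond because the cached base already exists. A minimal sketch of that pattern with oslo.concurrency, assuming an external file lock named after the image hash (the function name and lock path are illustrative, not nova internals):

    from oslo_concurrency import lockutils

    IMAGE_HASH = '27f3756dd30074249f54b073a56d4c88beec31b4'

    @lockutils.synchronized(IMAGE_HASH, external=True,
                            lock_path='/var/lib/nova/instances/locks')
    def fetch_base_image():
        # The first caller does the slow fetch/convert (held 2.242s above);
        # later callers find the cached base image present and return
        # almost immediately (held ~0.001s above).
        pass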
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.033 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.034 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.035 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
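The traceback above is the OVN metadata agent trying to record its metadata-proxy id on its Chassis_Private row right after the southbound connection dropped; the row is missing from the agent's local IDL cache, so the lookup raises RowNotFound. A minimal sketch of the same operation and its handling, assuming an already-connected southbound API handle passed in as sb_idl (not shown in the log):

    from ovsdbapp.backend.ovs_idl import idlutils

    def register_metadata_id(sb_idl):
        # sb_idl: an already-connected ovsdbapp southbound backend (assumed).
        chassis = '9803b804-d88a-4443-b777-6ecddbb75ed8'
        try:
            # Same DbAddCommand as in the log: add the key/value pair to
            # the chassis row's external_ids column.
            sb_idl.db_add('Chassis_Private', chassis, 'external_ids',
                          {'neutron:ovn-metadata-id':
                           'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'}).execute(
                              check_error=True)
        except idlutils.RowNotFound:
            # Transient here: the SB session had just dropped, so the local
            # row cache was stale; the update is repeated on a later sync
            # (the retry itself is an assumption, it is not shown above).
            pass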
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.454 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.455 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.454 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:23:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:17.455 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.463 182959 INFO oslo.privsep.daemon [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.364 211759 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.368 211759 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.371 211759 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.372 211759 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211759#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.469 182959 WARNING oslo_privsep.priv_context [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] privsep daemon already running#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.470 182959 WARNING oslo_privsep.priv_context [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] privsep daemon already running#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.470 182959 WARNING oslo_privsep.priv_context [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] privsep daemon already running#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.476 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Successfully created port: fdb27b02-1754-4584-a740-cdc3db8468a1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.544 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.557 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.570 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.587 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.600 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.602 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.603 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.635 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.646 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.647 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.647 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.648 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.659 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.659 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.678 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.678 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.702 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.703 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.704 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.715 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.729 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.761 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.762 182959 DEBUG nova.virt.disk.api [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Checking if we can resize image /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.763 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.774 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.776 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.811 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.812 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.813 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.826 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.850 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.864 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 46.80 sec#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.866 182959 DEBUG oslo_concurrency.processutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.867 182959 DEBUG nova.virt.disk.api [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Cannot resize image /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.868 182959 DEBUG nova.objects.instance [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lazy-loading 'migration_context' on Instance uuid 099fe98a-cfb6-42a3-a483-1e6d9b08c58f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.871 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.871 182959 DEBUG nova.virt.disk.api [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Checking if we can resize image /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.872 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.905 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.906 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.923 182959 DEBUG nova.virt.libvirt.driver [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.924 182959 DEBUG nova.virt.libvirt.driver [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Ensure instance console log exists: /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.924 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.925 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.925 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.927 182959 DEBUG oslo_concurrency.processutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.928 182959 DEBUG nova.virt.disk.api [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Cannot resize image /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.929 182959 DEBUG nova.objects.instance [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lazy-loading 'migration_context' on Instance uuid 4d147881-3a87-455b-8fa9-c0e3091974fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.933 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.934 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.934 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.948 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.972 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.990 182959 DEBUG nova.virt.libvirt.driver [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.991 182959 DEBUG nova.virt.libvirt.driver [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Ensure instance console log exists: /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.991 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.992 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.992 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.993 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.993 182959 DEBUG nova.virt.disk.api [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Checking if we can resize image /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:23:17 np0005601978 nova_compute[182955]: 2026-01-30 09:23:17.993 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.034 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.035 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.044 182959 DEBUG oslo_concurrency.processutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.045 182959 DEBUG nova.virt.disk.api [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Cannot resize image /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.045 182959 DEBUG nova.objects.instance [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lazy-loading 'migration_context' on Instance uuid bab621cc-63ad-44f1-b991-ea0709fbf1ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.061 182959 DEBUG nova.virt.libvirt.driver [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.061 182959 DEBUG nova.virt.libvirt.driver [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Ensure instance console log exists: /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.062 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.062 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.062 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.108 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk 1073741824" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.108 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.109 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.148 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.148 182959 DEBUG nova.virt.disk.api [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Checking if we can resize image /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.149 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.222 182959 DEBUG oslo_concurrency.processutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.223 182959 DEBUG nova.virt.disk.api [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Cannot resize image /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.223 182959 DEBUG nova.objects.instance [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lazy-loading 'migration_context' on Instance uuid 88b81871-e30c-47d0-972b-3c5ec68db2ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.234 182959 DEBUG nova.virt.libvirt.driver [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.235 182959 DEBUG nova.virt.libvirt.driver [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Ensure instance console log exists: /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.235 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.235 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.235 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:18 np0005601978 nova_compute[182955]: 2026-01-30 09:23:18.420 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Successfully created port: 2804d816-4089-492f-a2cd-754412aade2c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:23:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:19.460 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:23:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:19.460 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:23:20 np0005601978 podman[211824]: 2026-01-30 09:23:20.407325723 +0000 UTC m=+0.065730739 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information.
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager Traceback (most recent call last):
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager     vif.destroy()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager     self.force_reraise()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager     raise self.value
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager     updated_port = self._update_port(
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information.
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.592 182959 ERROR nova.compute.manager
Jan 30 04:23:20 np0005601978 nova_compute[182955]: Traceback (most recent call last):
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/hubs/poll.py", line 111, in wait
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    listener.cb(fileno)
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    result = function(*args, **kwargs)
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    return func(*args, **kwargs)
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    raise e
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    vif.destroy()
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    self.force_reraise()
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    raise self.value
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    updated_port = self._update_port(
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    _ensure_no_port_binding_failure(port)
Jan 30 04:23:20 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:20 np0005601978 nova_compute[182955]:    raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:20 np0005601978 nova_compute[182955]: nova.exception.PortBindingFailed: Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information.
Jan 30 04:23:20 np0005601978 nova_compute[182955]: Removing descriptor: 32
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information.
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Traceback (most recent call last):
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     yield resources
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     network_info_str = str(network_info)
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     self.wait()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     self[:] = self._gt.wait()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     return self._exit_event.wait()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     result = hub.switch()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     return self.greenlet.switch()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     result = function(*args, **kwargs)
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     return func(*args, **kwargs)
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     raise e
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     vif.destroy()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     self.force_reraise()
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     raise self.value
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     updated_port = self._update_port(
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     _ensure_no_port_binding_failure(port)
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] nova.exception.PortBindingFailed: Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information.
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.599 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.603 182959 INFO nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Terminating instance
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.605 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.606 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquired lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.606 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.914 182959 DEBUG nova.compute.manager [req-ffb4a04d-ae4f-41e2-b5a7-62dd9f892de8 req-9f20ec7c-b818-443b-99bf-8cc17e0261a5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Received event network-changed-bc0d9026-8b08-4f36-893e-d2e2df84b25b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.915 182959 DEBUG nova.compute.manager [req-ffb4a04d-ae4f-41e2-b5a7-62dd9f892de8 req-9f20ec7c-b818-443b-99bf-8cc17e0261a5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Refreshing instance network info cache due to event network-changed-bc0d9026-8b08-4f36-893e-d2e2df84b25b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 30 04:23:20 np0005601978 nova_compute[182955]: 2026-01-30 09:23:20.916 182959 DEBUG oslo_concurrency.lockutils [req-ffb4a04d-ae4f-41e2-b5a7-62dd9f892de8 req-9f20ec7c-b818-443b-99bf-8cc17e0261a5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information.
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager Traceback (most recent call last):
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager     vif.destroy()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager     self.force_reraise()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager     raise self.value
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager     updated_port = self._update_port(
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information.
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.113 182959 ERROR nova.compute.manager
Jan 30 04:23:21 np0005601978 nova_compute[182955]: Traceback (most recent call last):
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/hubs/poll.py", line 111, in wait
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    listener.cb(fileno)
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    result = function(*args, **kwargs)
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    return func(*args, **kwargs)
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    raise e
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    vif.destroy()
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    self.force_reraise()
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    raise self.value
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    updated_port = self._update_port(
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    _ensure_no_port_binding_failure(port)
Jan 30 04:23:21 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:21 np0005601978 nova_compute[182955]:    raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:21 np0005601978 nova_compute[182955]: nova.exception.PortBindingFailed: Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information.
Jan 30 04:23:21 np0005601978 nova_compute[182955]: Removing descriptor: 30
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information.
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Traceback (most recent call last):
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     yield resources
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     network_info_str = str(network_info)
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     self.wait()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     self[:] = self._gt.wait()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     return self._exit_event.wait()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     result = hub.switch()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     return self.greenlet.switch()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     result = function(*args, **kwargs)
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     return func(*args, **kwargs)
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     raise e
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     vif.destroy()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     self.force_reraise()
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     raise self.value
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     updated_port = self._update_port(
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     _ensure_no_port_binding_failure(port)
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] nova.exception.PortBindingFailed: Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information.
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.115 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.117 182959 INFO nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Terminating instance
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.118 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.119 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquired lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.119 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.121 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.257 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.447 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:23:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:21.462 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:23:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:21.462 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:23:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:21.462 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:23:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:21.463 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:23:21 np0005601978 nova_compute[182955]: 2026-01-30 09:23:21.993 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.053 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Releasing lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.055 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.056 182959 DEBUG oslo_concurrency.lockutils [req-ffb4a04d-ae4f-41e2-b5a7-62dd9f892de8 req-9f20ec7c-b818-443b-99bf-8cc17e0261a5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.056 182959 DEBUG nova.network.neutron [req-ffb4a04d-ae4f-41e2-b5a7-62dd9f892de8 req-9f20ec7c-b818-443b-99bf-8cc17e0261a5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Refreshing network info cache for port bc0d9026-8b08-4f36-893e-d2e2df84b25b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.062 182959 DEBUG nova.virt.libvirt.driver [-] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.063 182959 INFO nova.virt.libvirt.driver [-] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Instance destroyed successfully.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.064 182959 INFO nova.virt.libvirt.driver [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Deleting instance files /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab_del
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.064 182959 INFO nova.virt.libvirt.driver [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Deletion of /var/lib/nova/instances/bab621cc-63ad-44f1-b991-ea0709fbf1ab_del complete
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager Traceback (most recent call last):
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager     vif.destroy()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager     self.force_reraise()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager     raise self.value
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager     updated_port = self._update_port(
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.184 182959 ERROR nova.compute.manager
Jan 30 04:23:22 np0005601978 nova_compute[182955]: Traceback (most recent call last):
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/hubs/poll.py", line 111, in wait
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    listener.cb(fileno)
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    result = function(*args, **kwargs)
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    return func(*args, **kwargs)
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    raise e
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    vif.destroy()
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    self.force_reraise()
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    raise self.value
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    updated_port = self._update_port(
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    _ensure_no_port_binding_failure(port)
Jan 30 04:23:22 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:22 np0005601978 nova_compute[182955]:    raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:22 np0005601978 nova_compute[182955]: nova.exception.PortBindingFailed: Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: Removing descriptor: 33
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Traceback (most recent call last):
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     yield resources
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     network_info_str = str(network_info)
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     self.wait()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     self[:] = self._gt.wait()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     return self._exit_event.wait()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     result = hub.switch()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     return self.greenlet.switch()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     result = function(*args, **kwargs)
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     return func(*args, **kwargs)
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     raise e
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     vif.destroy()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     self.force_reraise()
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     raise self.value
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     updated_port = self._update_port(
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     _ensure_no_port_binding_failure(port)
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] nova.exception.PortBindingFailed: Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.186 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.188 182959 INFO nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Terminating instance
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.190 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.190 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.191 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.207 182959 INFO nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Took 0.15 seconds to destroy the instance on the hypervisor.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.208 182959 DEBUG oslo.service.loopingcall [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.209 182959 DEBUG nova.compute.manager [-] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.210 182959 DEBUG nova.network.neutron [-] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.285 182959 DEBUG nova.compute.manager [req-21b9e273-48c3-4af1-b60a-554423a753bb req-6cd275d2-b44f-464d-9d4b-c0cd11a33cce dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Received event network-vif-deleted-bc0d9026-8b08-4f36-893e-d2e2df84b25b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.457 182959 DEBUG nova.network.neutron [req-ffb4a04d-ae4f-41e2-b5a7-62dd9f892de8 req-9f20ec7c-b818-443b-99bf-8cc17e0261a5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.541 182959 DEBUG nova.network.neutron [-] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.566 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.599 182959 DEBUG nova.network.neutron [-] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.600 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Releasing lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.601 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.607 182959 DEBUG nova.virt.libvirt.driver [-] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.608 182959 INFO nova.virt.libvirt.driver [-] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Instance destroyed successfully.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.608 182959 INFO nova.virt.libvirt.driver [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Deleting instance files /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f_del
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.609 182959 INFO nova.virt.libvirt.driver [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Deletion of /var/lib/nova/instances/099fe98a-cfb6-42a3-a483-1e6d9b08c58f_del complete
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.658 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.675 182959 INFO nova.compute.manager [-] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Took 0.47 seconds to deallocate network for instance.
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.677 182959 DEBUG nova.compute.claims [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Aborting claim: <nova.compute.claims.Claim object at 0x7f6100370370> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.678 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.679 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.736 182959 INFO nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Took 0.13 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.737 182959 DEBUG oslo.service.loopingcall [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.738 182959 DEBUG nova.compute.manager [-] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.738 182959 DEBUG nova.network.neutron [-] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:23:22 np0005601978 nova_compute[182955]: 2026-01-30 09:23:22.938 182959 DEBUG nova.compute.provider_tree [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.022 182959 DEBUG nova.scheduler.client.report [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Updated inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 with generation 6 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.023 182959 DEBUG nova.compute.provider_tree [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Updating resource provider 5912bad0-7860-4f37-8078-1db5720295f4 generation from 6 to 7 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.024 182959 DEBUG nova.compute.provider_tree [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.031 182959 DEBUG nova.network.neutron [-] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.063 182959 DEBUG nova.network.neutron [-] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.078 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information.
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Traceback (most recent call last):
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     network_info_str = str(network_info)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     self.wait()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     self[:] = self._gt.wait()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     return self._exit_event.wait()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     result = hub.switch()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     return self.greenlet.switch()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     result = function(*args, **kwargs)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     return func(*args, **kwargs)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     raise e
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     vif.destroy()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     self.force_reraise()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     raise self.value
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     updated_port = self._update_port(
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     _ensure_no_port_binding_failure(port)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] nova.exception.PortBindingFailed: Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information.
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.079 182959 ERROR nova.compute.manager [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] #033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.081 182959 DEBUG nova.compute.utils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information. notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.082 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Build of instance bab621cc-63ad-44f1-b991-ea0709fbf1ab was re-scheduled: Binding failed for port bc0d9026-8b08-4f36-893e-d2e2df84b25b, please check neutron logs for more information. _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2450#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.083 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.084 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquiring lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.103 182959 INFO nova.compute.manager [-] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Took 0.36 seconds to deallocate network for instance.#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.105 182959 DEBUG nova.compute.claims [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Aborting claim: <nova.compute.claims.Claim object at 0x7f610037ba00> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.106 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.106 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.133 182959 DEBUG nova.compute.manager [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Received event network-changed-47712b61-9e19-426f-a238-164499a5e96e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.133 182959 DEBUG nova.compute.manager [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Refreshing instance network info cache due to event network-changed-47712b61-9e19-426f-a238-164499a5e96e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.133 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.134 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.134 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Refreshing network info cache for port 47712b61-9e19-426f-a238-164499a5e96e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.285 182959 DEBUG nova.network.neutron [req-ffb4a04d-ae4f-41e2-b5a7-62dd9f892de8 req-9f20ec7c-b818-443b-99bf-8cc17e0261a5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.300 182959 DEBUG nova.compute.provider_tree [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.305 182959 DEBUG oslo_concurrency.lockutils [req-ffb4a04d-ae4f-41e2-b5a7-62dd9f892de8 req-9f20ec7c-b818-443b-99bf-8cc17e0261a5 dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.306 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Acquired lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.306 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.320 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.334 182959 DEBUG nova.scheduler.client.report [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information.
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Traceback (most recent call last):
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     network_info_str = str(network_info)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     self.wait()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     self[:] = self._gt.wait()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     return self._exit_event.wait()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     result = hub.switch()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     return self.greenlet.switch()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     result = function(*args, **kwargs)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     return func(*args, **kwargs)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     raise e
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     vif.destroy()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     self.force_reraise()
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     raise self.value
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     updated_port = self._update_port(
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     _ensure_no_port_binding_failure(port)
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] nova.exception.PortBindingFailed: Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information.
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.368 182959 ERROR nova.compute.manager [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] #033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.369 182959 DEBUG nova.compute.utils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information. notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.370 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Build of instance 099fe98a-cfb6-42a3-a483-1e6d9b08c58f was re-scheduled: Binding failed for port 47712b61-9e19-426f-a238-164499a5e96e, please check neutron logs for more information. _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2450#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.370 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.370 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquiring lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.488 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.519 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.520 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.524 182959 DEBUG nova.virt.libvirt.driver [-] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.524 182959 INFO nova.virt.libvirt.driver [-] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Instance destroyed successfully.#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.525 182959 INFO nova.virt.libvirt.driver [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Deleting instance files /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca_del#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.525 182959 INFO nova.virt.libvirt.driver [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Deletion of /var/lib/nova/instances/88b81871-e30c-47d0-972b-3c5ec68db2ca_del complete#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.561 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.617 182959 INFO nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Took 0.10 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.617 182959 DEBUG oslo.service.loopingcall [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.618 182959 DEBUG nova.compute.manager [-] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.619 182959 DEBUG nova.network.neutron [-] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.718 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.868 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.869 182959 DEBUG nova.compute.manager [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Received event network-changed-fdb27b02-1754-4584-a740-cdc3db8468a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.869 182959 DEBUG nova.compute.manager [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Refreshing instance network info cache due to event network-changed-fdb27b02-1754-4584-a740-cdc3db8468a1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.869 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.869 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.869 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Refreshing network info cache for port fdb27b02-1754-4584-a740-cdc3db8468a1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.870 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Acquired lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:23 np0005601978 nova_compute[182955]: 2026-01-30 09:23:23.870 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.142 182959 DEBUG nova.network.neutron [-] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information.
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager Traceback (most recent call last):
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager     vif.destroy()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager     self.force_reraise()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager     raise self.value
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager     updated_port = self._update_port(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information.
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.149 182959 ERROR nova.compute.manager #033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: Traceback (most recent call last):
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/hubs/poll.py", line 111, in wait
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    listener.cb(fileno)
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    result = function(*args, **kwargs)
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    return func(*args, **kwargs)
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    raise e
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    vif.destroy()
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    self.force_reraise()
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    raise self.value
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    updated_port = self._update_port(
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    _ensure_no_port_binding_failure(port)
Jan 30 04:23:24 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:24 np0005601978 nova_compute[182955]:    raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:24 np0005601978 nova_compute[182955]: nova.exception.PortBindingFailed: Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information.
Jan 30 04:23:24 np0005601978 nova_compute[182955]: Removing descriptor: 34
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information.
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Traceback (most recent call last):
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     yield resources
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     network_info_str = str(network_info)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     self.wait()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     self[:] = self._gt.wait()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     return self._exit_event.wait()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     result = hub.switch()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     return self.greenlet.switch()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     result = function(*args, **kwargs)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     return func(*args, **kwargs)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     raise e
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     vif.destroy()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     self.force_reraise()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     raise self.value
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     updated_port = self._update_port(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     _ensure_no_port_binding_failure(port)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] nova.exception.PortBindingFailed: Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information.
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.150 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] #033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.151 182959 INFO nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Terminating instance#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.152 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.153 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.153 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.206 182959 DEBUG nova.network.neutron [-] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.213 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.231 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.239 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.268 182959 INFO nova.compute.manager [-] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Took 0.65 seconds to deallocate network for instance.#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.269 182959 DEBUG nova.compute.claims [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Aborting claim: <nova.compute.claims.Claim object at 0x7f6100201fd0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.269 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.269 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.273 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Releasing lock "refresh_cache-bab621cc-63ad-44f1-b991-ea0709fbf1ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.273 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.273 182959 DEBUG nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.273 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.389 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.429 182959 DEBUG nova.compute.provider_tree [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.435 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.457 182959 DEBUG nova.network.neutron [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.460 182959 DEBUG nova.scheduler.client.report [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.484 182959 INFO nova.compute.manager [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] [instance: bab621cc-63ad-44f1-b991-ea0709fbf1ab] Took 0.21 seconds to deallocate network for instance.#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information.
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Traceback (most recent call last):
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     network_info_str = str(network_info)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     self.wait()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     self[:] = self._gt.wait()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     return self._exit_event.wait()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     result = hub.switch()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     return self.greenlet.switch()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     result = function(*args, **kwargs)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     return func(*args, **kwargs)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     raise e
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     vif.destroy()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     self.force_reraise()
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     raise self.value
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     updated_port = self._update_port(
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     _ensure_no_port_binding_failure(port)
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] nova.exception.PortBindingFailed: Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information.
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.489 182959 ERROR nova.compute.manager [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] #033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.491 182959 DEBUG nova.compute.utils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information. notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.491 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Build of instance 88b81871-e30c-47d0-972b-3c5ec68db2ca was re-scheduled: Binding failed for port fdb27b02-1754-4584-a740-cdc3db8468a1, please check neutron logs for more information. _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2450#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.492 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.492 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquiring lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.588 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.609 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Releasing lock "refresh_cache-099fe98a-cfb6-42a3-a483-1e6d9b08c58f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.610 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.610 182959 DEBUG nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.610 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.626 182959 INFO nova.scheduler.client.report [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Deleted allocations for instance bab621cc-63ad-44f1-b991-ea0709fbf1ab#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.668 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.677 182959 DEBUG oslo_concurrency.lockutils [None req-5b5720b0-51c9-4e24-8a16-89a9ca888f3c 9102293eb6874c798881ad2a64e09228 292129f9fc7a469199b7343ecb8146e6 - - default default] Lock "bab621cc-63ad-44f1-b991-ea0709fbf1ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.696 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.697 182959 DEBUG nova.compute.manager [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Received event network-vif-deleted-47712b61-9e19-426f-a238-164499a5e96e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.697 182959 DEBUG nova.compute.manager [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Received event network-vif-deleted-fdb27b02-1754-4584-a740-cdc3db8468a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.698 182959 DEBUG nova.compute.manager [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Received event network-changed-2804d816-4089-492f-a2cd-754412aade2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.698 182959 DEBUG nova.compute.manager [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Refreshing instance network info cache due to event network-changed-2804d816-4089-492f-a2cd-754412aade2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.698 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.699 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Acquired lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.699 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.806 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.819 182959 DEBUG nova.network.neutron [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.834 182959 INFO nova.compute.manager [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] [instance: 099fe98a-cfb6-42a3-a483-1e6d9b08c58f] Took 0.22 seconds to deallocate network for instance.#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.872 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.898 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.916 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.916 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.916 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.917 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Refreshing network info cache for port 2804d816-4089-492f-a2cd-754412aade2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.921 182959 DEBUG nova.virt.libvirt.driver [-] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.921 182959 INFO nova.virt.libvirt.driver [-] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Instance destroyed successfully.#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.921 182959 INFO nova.virt.libvirt.driver [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Deleting instance files /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb_del#033[00m
Jan 30 04:23:24 np0005601978 nova_compute[182955]: 2026-01-30 09:23:24.922 182959 INFO nova.virt.libvirt.driver [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Deletion of /var/lib/nova/instances/4d147881-3a87-455b-8fa9-c0e3091974fb_del complete#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.038 182959 INFO nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Took 0.12 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.039 182959 DEBUG oslo.service.loopingcall [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.040 182959 DEBUG nova.compute.manager [-] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.040 182959 DEBUG nova.network.neutron [-] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.073 182959 INFO nova.scheduler.client.report [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Deleted allocations for instance 099fe98a-cfb6-42a3-a483-1e6d9b08c58f#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.108 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.134 182959 DEBUG oslo_concurrency.lockutils [None req-73c05655-5368-4b49-99f2-1ac672c58735 c71dc7e6c4c24097bec57442c1a2bfa4 942f48af96234078956f0ff31f10cb75 - - default default] Lock "099fe98a-cfb6-42a3-a483-1e6d9b08c58f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.182 182959 DEBUG nova.network.neutron [-] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.219 182959 DEBUG nova.network.neutron [-] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.243 182959 DEBUG nova.compute.manager [req-80dcee4f-d0c3-4317-b1e1-4c2618eef3a3 req-5d8772e5-fa3e-40af-b388-d1178063122c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Received event network-vif-deleted-2804d816-4089-492f-a2cd-754412aade2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.248 182959 INFO nova.compute.manager [-] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Took 0.21 seconds to deallocate network for instance.#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.249 182959 DEBUG nova.compute.claims [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Aborting claim: <nova.compute.claims.Claim object at 0x7f61002a18b0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.250 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.250 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.342 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.363 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Releasing lock "refresh_cache-88b81871-e30c-47d0-972b-3c5ec68db2ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.364 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.364 182959 DEBUG nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.365 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.377 182959 DEBUG nova.compute.provider_tree [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.395 182959 DEBUG nova.scheduler.client.report [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information.
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Traceback (most recent call last):
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     network_info_str = str(network_info)
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     self.wait()
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     self[:] = self._gt.wait()
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     return self._exit_event.wait()
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     result = hub.switch()
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     return self.greenlet.switch()
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     result = function(*args, **kwargs)
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     return func(*args, **kwargs)
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     raise e
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     vif.destroy()
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     self.force_reraise()
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     raise self.value
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     updated_port = self._update_port(
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     _ensure_no_port_binding_failure(port)
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] nova.exception.PortBindingFailed: Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information.
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.433 182959 ERROR nova.compute.manager [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] #033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.435 182959 DEBUG nova.compute.utils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information. notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.437 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Build of instance 4d147881-3a87-455b-8fa9-c0e3091974fb was re-scheduled: Binding failed for port 2804d816-4089-492f-a2cd-754412aade2c, please check neutron logs for more information. _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2450#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.438 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.438 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquiring lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:23:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:25.470 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:23:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:25.470 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.572 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.598 182959 DEBUG nova.network.neutron [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.615 182959 INFO nova.compute.manager [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] [instance: 88b81871-e30c-47d0-972b-3c5ec68db2ca] Took 0.25 seconds to deallocate network for instance.#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.635 182959 DEBUG nova.network.neutron [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.666 182959 DEBUG oslo_concurrency.lockutils [req-440c0215-52ed-46b2-a374-4a3ca240fac9 req-bdc102ba-96af-49d3-960d-a2abd1c345ec dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.667 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Acquired lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.667 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.762 182959 INFO nova.scheduler.client.report [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Deleted allocations for instance 88b81871-e30c-47d0-972b-3c5ec68db2ca#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.809 182959 DEBUG oslo_concurrency.lockutils [None req-ff7fa623-4ff0-4330-84eb-674eb4a9ed54 4f469d29ddd6455299c7fb0220c1ffcc 69532d75aefe4fa6ada76bf1c1d1da9b - - default default] Lock "88b81871-e30c-47d0-972b-3c5ec68db2ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:25 np0005601978 nova_compute[182955]: 2026-01-30 09:23:25.851 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.029 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.048 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Releasing lock "refresh_cache-4d147881-3a87-455b-8fa9-c0e3091974fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.049 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.049 182959 DEBUG nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.050 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.196 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.214 182959 DEBUG nova.network.neutron [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.230 182959 INFO nova.compute.manager [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] [instance: 4d147881-3a87-455b-8fa9-c0e3091974fb] Took 0.18 seconds to deallocate network for instance.#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.375 182959 INFO nova.scheduler.client.report [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Deleted allocations for instance 4d147881-3a87-455b-8fa9-c0e3091974fb#033[00m
Jan 30 04:23:27 np0005601978 nova_compute[182955]: 2026-01-30 09:23:27.429 182959 DEBUG oslo_concurrency.lockutils [None req-5c52fcf0-e580-46c2-ace7-1c6ee5c82d3b a6dc933f15eb45099f724c1eb5d822e4 33bba0bc2a744596b558c6598a1970de - - default default] Lock "4d147881-3a87-455b-8fa9-c0e3091974fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:23:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:29.473 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:23:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:29.474 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:23:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:29.474 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:23:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:29.475 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:23:31 np0005601978 podman[211848]: 2026-01-30 09:23:31.415509767 +0000 UTC m=+0.073990287 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9/ubi-minimal)
Jan 30 04:23:33 np0005601978 podman[211870]: 2026-01-30 09:23:33.3895441 +0000 UTC m=+0.053692557 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:23:40 np0005601978 podman[211888]: 2026-01-30 09:23:40.419596824 +0000 UTC m=+0.070183555 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:23:43 np0005601978 podman[211914]: 2026-01-30 09:23:43.407750658 +0000 UTC m=+0.065346189 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:23:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:23:47Z|00098|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:23:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:23:47Z|00099|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:23:47 np0005601978 podman[211933]: 2026-01-30 09:23:47.428758665 +0000 UTC m=+0.081599681 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 30 04:23:51 np0005601978 podman[211959]: 2026-01-30 09:23:51.388195595 +0000 UTC m=+0.052290034 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:23:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:53.527 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:23:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:53.534 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:23:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:53.538 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:23:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:53.539 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:23:55 np0005601978 nova_compute[182955]: 2026-01-30 09:23:55.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:23:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:23:56 np0005601978 nova_compute[182955]: 2026-01-30 09:23:56.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:23:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:57.329 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:23:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:57.330 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:23:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:23:57.330 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:24:01Z|00100|chassis|WARN|Dropped 1 log messages in last 14 seconds (most recently, 14 seconds ago) due to excessive rate
Jan 30 04:24:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:24:01Z|00101|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:24:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:01.540 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:24:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:01.541 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:24:02 np0005601978 podman[211983]: 2026-01-30 09:24:02.411560137 +0000 UTC m=+0.076355616 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.7, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter)
Jan 30 04:24:04 np0005601978 podman[212005]: 2026-01-30 09:24:04.416956266 +0000 UTC m=+0.084341628 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:24:11 np0005601978 podman[212025]: 2026-01-30 09:24:11.387594407 +0000 UTC m=+0.050619173 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:24:14 np0005601978 podman[212050]: 2026-01-30 09:24:14.395761033 +0000 UTC m=+0.056171945 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 30 04:24:18 np0005601978 ovn_controller[95419]: 2026-01-30T09:24:18Z|00102|chassis|WARN|Dropped 29 log messages in last 15 seconds (most recently, 9 seconds ago) due to excessive rate
Jan 30 04:24:18 np0005601978 ovn_controller[95419]: 2026-01-30T09:24:18Z|00103|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:24:18 np0005601978 podman[212069]: 2026-01-30 09:24:18.422818137 +0000 UTC m=+0.087642401 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:24:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:19.507 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:24:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:19.507 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:24:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:19.508 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:24:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:19.509 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:24:20 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:20.514 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:24:20 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:20.515 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:24:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:21.516 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:24:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:21.516 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:24:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:21.516 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:24:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:21.517 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:24:22 np0005601978 podman[212095]: 2026-01-30 09:24:22.398827809 +0000 UTC m=+0.058840588 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:24:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:23.520 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:24:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:23.521 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:24:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:25.522 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:24:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:25.522 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:24:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:25.524 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:24:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:25.524 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
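Both ovn_metadata_agent workers (the 104657 and 105213 log PIDs) have lost their SSL session to the OVN southbound database at ovsdbserver-sb.openstack.svc:6642; "Transport endpoint is not connected" is ENOTCONN on the underlying socket. ovsdbapp then retries with a doubling backoff, waiting 2 s and then 4 s here, and a few lines further on it keeps reconnecting in the background while suppressing further log output. A rough Python sketch of that doubling backoff, using illustrative constants rather than the actual OVS reconnect state machine that ovsdbapp relies on:

    import socket
    import ssl
    import time

    def connect_with_backoff(host, port, max_backoff=8.0):
        """Doubling backoff that mirrors the 2 s / 4 s waits in the log.
        Illustrative only: the agent itself uses the OVS reconnect FSM via ovsdbapp."""
        backoff = 2.0
        ctx = ssl.create_default_context()
        while True:
            try:
                raw = socket.create_connection((host, port), timeout=backoff)
                return ctx.wrap_socket(raw, server_hostname=host)
            except OSError as err:
                print(f"{host}:{port}: connection attempt failed ({err}); "
                      f"waiting {backoff:.0f} seconds before reconnect")
                time.sleep(backoff)
                backoff = min(backoff * 2, max_backoff)

    # connect_with_backoff("ovsdbserver-sb.openstack.svc", 6642)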
Jan 30 04:24:26 np0005601978 nova_compute[182955]: 2026-01-30 09:24:26.218 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [SYS] unknown error (_ssl.c:2501)#033[00m
Jan 30 04:24:27 np0005601978 nova_compute[182955]: 2026-01-30 09:24:27.308 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [SYS] unknown error (_ssl.c:2501)#033[00m
Jan 30 04:24:27 np0005601978 nova_compute[182955]: 2026-01-30 09:24:27.310 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [SYS] unknown error (_ssl.c:2501)#033[00m
Jan 30 04:24:28 np0005601978 nova_compute[182955]: 2026-01-30 09:24:28.819 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [SYS] unknown error (_ssl.c:2501)#033[00m
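In the same window nova-compute's RabbitMQ connection is failing at the SSL layer; oslo.messaging treats these as recoverable connection/channel errors and keeps re-dialling the broker instead of aborting the service. The broker endpoint is not printed in these lines, so the probe below uses a placeholder host and port (substitute the transport_url values from the node's nova configuration); it only checks that a TLS handshake completes, nothing more:

    import socket
    import ssl

    def probe_tls(host, port, timeout=5.0):
        """Open a TCP connection and attempt a TLS handshake; report the outcome.
        Certificate verification is disabled because this is a reachability check,
        not an identity check."""
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        try:
            with socket.create_connection((host, port), timeout=timeout) as raw:
                with ctx.wrap_socket(raw, server_hostname=host) as tls:
                    print(f"{host}:{port} handshake ok, protocol={tls.version()}")
        except OSError as err:
            print(f"{host}:{port} failed: {err}")

    # Placeholder endpoint, not taken from this log.
    probe_tls("rabbitmq.openstack.svc", 5671)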
Jan 30 04:24:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:29.528 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:24:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:29.531 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:24:33 np0005601978 podman[212119]: 2026-01-30 09:24:33.373996154 +0000 UTC m=+0.035728420 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.7, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter)
Jan 30 04:24:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:33.531 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:24:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:33.531 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:24:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:33.536 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:24:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:33.536 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:24:35 np0005601978 podman[212141]: 2026-01-30 09:24:35.381395851 +0000 UTC m=+0.047048869 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.805 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 42.93 sec#033[00m
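This warning is the knock-on effect of the messaging errors above: DbDriver._report_state is the periodic service heartbeat nova-compute sends every report_interval (10 s by default), and this iteration overran that interval by 42.93 s, i.e. on the order of 53 s wall time on the default interval, during which the control plane may briefly have considered the service down. oslo.service's looping call emits the warning whenever a run's elapsed time exceeds the interval; an illustrative sketch of that check, not the oslo implementation:

    import time

    def run_periodically(task, interval):
        """Run task() every `interval` seconds; warn when a run overruns the
        interval, in the same spirit as the oslo.service loopingcall warning."""
        while True:
            start = time.monotonic()
            task()
            delay = interval - (time.monotonic() - start)
            if delay < 0:
                print(f"WARNING: task run outlasted interval by {-delay:.2f} sec")
                delay = 0
            time.sleep(delay)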
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.807 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.807 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.807 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.807 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.945 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.946 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6030MB free_disk=73.36343383789062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.947 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:40 np0005601978 nova_compute[182955]: 2026-01-30 09:24:40.947 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:41 np0005601978 nova_compute[182955]: 2026-01-30 09:24:41.301 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:24:41 np0005601978 nova_compute[182955]: 2026-01-30 09:24:41.302 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:24:41 np0005601978 nova_compute[182955]: 2026-01-30 09:24:41.324 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:24:41 np0005601978 nova_compute[182955]: 2026-01-30 09:24:41.348 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
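The inventory line also shows how the Placement-visible capacity is derived: for each resource class the schedulable amount is (total - reserved) * allocation_ratio, so this node exposes (7679 - 512) * 1.0 = 7167 MB of RAM, (8 - 0) * 4.0 = 32 VCPUs and (79 - 1) * 0.9 = 70.2 GB of disk. The same arithmetic applied to the dict from the log line:

    # Inventory exactly as reported by nova.scheduler.client.report above.
    inventory = {
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
        "DISK_GB": {"total": 79, "reserved": 1, "allocation_ratio": 0.9},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: schedulable capacity = {capacity:g}")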
Jan 30 04:24:41 np0005601978 nova_compute[182955]: 2026-01-30 09:24:41.378 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:24:41 np0005601978 nova_compute[182955]: 2026-01-30 09:24:41.379 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:42 np0005601978 podman[212160]: 2026-01-30 09:24:42.39809262 +0000 UTC m=+0.056651407 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.380 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.381 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.382 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.382 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.399 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.399 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.400 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.400 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.401 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.401 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:43 np0005601978 nova_compute[182955]: 2026-01-30 09:24:43.402 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:24:45 np0005601978 podman[212185]: 2026-01-30 09:24:45.390390428 +0000 UTC m=+0.054843664 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:24:49 np0005601978 ovn_controller[95419]: 2026-01-30T09:24:49Z|00104|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:24:49 np0005601978 ovn_controller[95419]: 2026-01-30T09:24:49Z|00105|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
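This warning means two Chassis records in the southbound database both claim the geneve encap IP 172.19.0.101: chassis d14b9ab5 holds it, and the claim for 9803b804 (the chassis record this node's metadata agent updates later in the log) is rejected. That pattern usually points at a stale chassis registration surviving a re-registration of the same host; ovn-controller refuses the duplicate and rate-limits the message, hence the "Dropped N log messages" lines around it. A small parser for pulling the conflicting pairs out of an exported journal (journal.txt is again a placeholder path):

    import re

    # Matches the ovn-controller "cannot duplicate" warnings seen in this log.
    DUP_ENCAP = re.compile(
        r"chassis\|WARN\|'(?P<existing>[0-9a-f-]+)' already has encap ip "
        r"'(?P<ip>[^']+)' and type '(?P<encap>[^']+)', cannot duplicate on "
        r"'(?P<claimant>[0-9a-f-]+)'"
    )

    def duplicate_encaps(journal_path):
        """Yield (existing_chassis, claimant_chassis, ip, encap_type) tuples."""
        with open(journal_path, errors="replace") as fh:
            for line in fh:
                m = DUP_ENCAP.search(line)
                if m:
                    yield (m.group("existing"), m.group("claimant"),
                           m.group("ip"), m.group("encap"))

    for existing, claimant, ip, encap in duplicate_encaps("journal.txt"):
        print(f"{ip}/{encap}: held by {existing}, rejected for {claimant}")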
Jan 30 04:24:49 np0005601978 podman[212204]: 2026-01-30 09:24:49.435560922 +0000 UTC m=+0.093704426 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible)
Jan 30 04:24:53 np0005601978 podman[212231]: 2026-01-30 09:24:53.398508954 +0000 UTC m=+0.058282155 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:24:53 np0005601978 nova_compute[182955]: 2026-01-30 09:24:53.450 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:56 np0005601978 nova_compute[182955]: 2026-01-30 09:24:56.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.332 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.333 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.333 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.457 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.457 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.458 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.458 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.550 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.557 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.558 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.558 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.559 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:24:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:24:57.571 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
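With the southbound connection re-established, the agent's SbGlobalUpdateEvent fires because northd has bumped SB_Global.nb_cfg from 4 to 5; the agent acknowledges by writing neutron:ovn-metadata-sb-cfg = '5' into the external_ids of its Chassis_Private record 9803b804, and because the stored value already matched, ovsdbapp reports that the transaction caused no change. The acknowledgement is just an idempotent key update; in sketch form (assumed shape only, the real write is an OVSDB transaction executed by the server):

    def ack_nb_cfg(chassis_external_ids, nb_cfg):
        """Record the metadata agent's nb_cfg acknowledgement, returning whether
        anything actually changed (compare with 'Transaction caused no change')."""
        key, value = "neutron:ovn-metadata-sb-cfg", str(nb_cfg)
        if chassis_external_ids.get(key) == value:
            return False          # already acknowledged, like the transaction above
        chassis_external_ids[key] = value
        return True

    external_ids = {"neutron:ovn-metadata-sb-cfg": "5"}   # state already in the DB
    print(ack_nb_cfg(external_ids, 5))                     # -> False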
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.636 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.637 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6033MB free_disk=73.36345291137695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.638 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.638 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.689 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.689 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.712 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.725 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.726 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:24:57 np0005601978 nova_compute[182955]: 2026-01-30 09:24:57.727 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:24:59 np0005601978 nova_compute[182955]: 2026-01-30 09:24:59.728 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:00 np0005601978 nova_compute[182955]: 2026-01-30 09:25:00.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:00 np0005601978 nova_compute[182955]: 2026-01-30 09:25:00.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:25:00 np0005601978 nova_compute[182955]: 2026-01-30 09:25:00.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:25:00 np0005601978 nova_compute[182955]: 2026-01-30 09:25:00.448 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:25:00 np0005601978 nova_compute[182955]: 2026-01-30 09:25:00.448 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:00 np0005601978 nova_compute[182955]: 2026-01-30 09:25:00.448 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:01 np0005601978 nova_compute[182955]: 2026-01-30 09:25:01.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:02 np0005601978 nova_compute[182955]: 2026-01-30 09:25:02.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:03 np0005601978 nova_compute[182955]: 2026-01-30 09:25:03.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:03 np0005601978 nova_compute[182955]: 2026-01-30 09:25:03.433 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:25:04 np0005601978 podman[212255]: 2026-01-30 09:25:04.397122156 +0000 UTC m=+0.056297308 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., vcs-type=git, version=9.7, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 30 04:25:06 np0005601978 podman[212277]: 2026-01-30 09:25:06.395511339 +0000 UTC m=+0.056503713 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:25:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:25:06Z|00106|chassis|WARN|Dropped 2 log messages in last 18 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 30 04:25:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:25:06Z|00107|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:25:13 np0005601978 podman[212297]: 2026-01-30 09:25:13.402129318 +0000 UTC m=+0.067597513 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:25:16 np0005601978 podman[212321]: 2026-01-30 09:25:16.383367597 +0000 UTC m=+0.046192463 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 30 04:25:20 np0005601978 ovn_controller[95419]: 2026-01-30T09:25:20Z|00108|chassis|WARN|Dropped 4 log messages in last 13 seconds (most recently, 13 seconds ago) due to excessive rate
Jan 30 04:25:20 np0005601978 ovn_controller[95419]: 2026-01-30T09:25:20Z|00109|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:25:20 np0005601978 podman[212340]: 2026-01-30 09:25:20.457433958 +0000 UTC m=+0.106264437 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:25:24 np0005601978 podman[212366]: 2026-01-30 09:25:24.386272813 +0000 UTC m=+0.045348023 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:25:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:27.923 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:25:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:27.923 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:25:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:27.925 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:25:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:27.925 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:25:35 np0005601978 podman[212391]: 2026-01-30 09:25:35.425723179 +0000 UTC m=+0.089203919 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1769056855)
Jan 30 04:25:37 np0005601978 podman[212412]: 2026-01-30 09:25:37.417744676 +0000 UTC m=+0.069345754 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:25:44 np0005601978 podman[212433]: 2026-01-30 09:25:44.408123116 +0000 UTC m=+0.066677571 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:25:47 np0005601978 podman[212457]: 2026-01-30 09:25:47.387691194 +0000 UTC m=+0.050979054 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:25:51 np0005601978 ovn_controller[95419]: 2026-01-30T09:25:51Z|00110|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:25:51 np0005601978 ovn_controller[95419]: 2026-01-30T09:25:51Z|00111|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:25:51 np0005601978 podman[212476]: 2026-01-30 09:25:51.467335526 +0000 UTC m=+0.118738019 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:25:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:51.958 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:25:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:51.962 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:25:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:51.971 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:25:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:51.973 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:25:54 np0005601978 nova_compute[182955]: 2026-01-30 09:25:54.281 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [SYS] unknown error (_ssl.c:2501)#033[00m
Jan 30 04:25:54 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:54.975 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:25:54 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:54.976 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:25:55 np0005601978 podman[212502]: 2026-01-30 09:25:55.39232688 +0000 UTC m=+0.049531880 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.751 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:25:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:25:56 np0005601978 nova_compute[182955]: 2026-01-30 09:25:56.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:25:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:57.333 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:25:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:57.334 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:25:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:25:57.334 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:25:59 np0005601978 nova_compute[182955]: 2026-01-30 09:25:59.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:03 np0005601978 nova_compute[182955]: 2026-01-30 09:26:03.751 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:03 np0005601978 nova_compute[182955]: 2026-01-30 09:26:03.751 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:03 np0005601978 nova_compute[182955]: 2026-01-30 09:26:03.752 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:03 np0005601978 nova_compute[182955]: 2026-01-30 09:26:03.752 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:26:03 np0005601978 nova_compute[182955]: 2026-01-30 09:26:03.899 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:26:03 np0005601978 nova_compute[182955]: 2026-01-30 09:26:03.900 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6025MB free_disk=73.36314010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:26:03 np0005601978 nova_compute[182955]: 2026-01-30 09:26:03.900 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:03 np0005601978 nova_compute[182955]: 2026-01-30 09:26:03.901 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:04 np0005601978 nova_compute[182955]: 2026-01-30 09:26:04.018 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:26:04 np0005601978 nova_compute[182955]: 2026-01-30 09:26:04.019 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:26:04 np0005601978 nova_compute[182955]: 2026-01-30 09:26:04.039 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:26:04 np0005601978 nova_compute[182955]: 2026-01-30 09:26:04.061 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:26:04 np0005601978 nova_compute[182955]: 2026-01-30 09:26:04.064 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:26:04 np0005601978 nova_compute[182955]: 2026-01-30 09:26:04.064 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.065 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.066 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.066 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.066 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.090 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.091 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.091 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.092 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.092 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.093 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:05 np0005601978 nova_compute[182955]: 2026-01-30 09:26:05.093 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:26:06 np0005601978 nova_compute[182955]: 2026-01-30 09:26:06.308 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 45.50 sec#033[00m
Jan 30 04:26:06 np0005601978 podman[212526]: 2026-01-30 09:26:06.419385027 +0000 UTC m=+0.072331373 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Jan 30 04:26:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:26:06Z|00112|chassis|WARN|Dropped 2 log messages in last 16 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 30 04:26:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:26:06Z|00113|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:26:08 np0005601978 podman[212547]: 2026-01-30 09:26:08.395639255 +0000 UTC m=+0.056557205 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Jan 30 04:26:15 np0005601978 podman[212568]: 2026-01-30 09:26:15.416999789 +0000 UTC m=+0.079714786 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.037 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.037 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:26:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:17.038 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
Jan 30 04:26:18 np0005601978 podman[212594]: 2026-01-30 09:26:18.397696535 +0000 UTC m=+0.055031929 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:26:22 np0005601978 ovn_controller[95419]: 2026-01-30T09:26:22Z|00114|chassis|WARN|Dropped 28 log messages in last 15 seconds (most recently, 8 seconds ago) due to excessive rate
Jan 30 04:26:22 np0005601978 ovn_controller[95419]: 2026-01-30T09:26:22Z|00115|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:26:22 np0005601978 podman[212613]: 2026-01-30 09:26:22.428217118 +0000 UTC m=+0.086800143 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:26:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:24.562 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:26:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:24.562 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:26:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:24.564 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:26:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:24.565 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:26:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:25.577 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:26:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:25.578 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:26:26 np0005601978 podman[212640]: 2026-01-30 09:26:26.385557048 +0000 UTC m=+0.041393049 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:26:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:26.578 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:26:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:26.579 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:26:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:26.580 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:26:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:26.580 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:26:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:28.584 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:26:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:28.586 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:26:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:30.587 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:26:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:30.587 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:26:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:30.589 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:26:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:30.589 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:26:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:34.596 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:26:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:34.597 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:26:37 np0005601978 podman[212664]: 2026-01-30 09:26:37.429700625 +0000 UTC m=+0.082771488 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vcs-type=git, version=9.7, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter)
Jan 30 04:26:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:38.598 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:26:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:38.599 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:26:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:38.602 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:26:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:38.602 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
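The two ovsdbapp IDL connections above (logged under 104657 and 105213) lose their SSL session to ovsdbserver-sb.openstack.svc:6642 and retry with a doubling back-off (2 s, then 4 s) before switching to quiet background reconnection. A simplified sketch of that retry pattern; the connect callable, delay cap and logging threshold are assumptions for illustration, not the actual ovsdbapp/OVS reconnect implementation:

    import logging
    import time

    log = logging.getLogger("reconnect-sketch")

    def reconnect_loop(connect, first_delay=2, max_delay=8, logged_attempts=3):
        """Retry connect() with a doubling delay; after logged_attempts failures,
        keep retrying silently, mirroring the 'suppressing further logging' line."""
        delay, attempts = first_delay, 0
        while True:
            try:
                log.info("connecting...")
                return connect()
            except OSError:
                attempts += 1
                if attempts <= logged_attempts:
                    log.info("connection attempt timed out; waiting %s seconds before reconnect", delay)
                elif attempts == logged_attempts + 1:
                    log.info("continuing to reconnect in the background but suppressing further logging")
                time.sleep(delay)
                delay = min(delay * 2, max_delay)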
Jan 30 04:26:39 np0005601978 podman[212685]: 2026-01-30 09:26:39.400110917 +0000 UTC m=+0.059870921 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:26:46 np0005601978 podman[212706]: 2026-01-30 09:26:46.4187458 +0000 UTC m=+0.069535948 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:26:49 np0005601978 podman[212732]: 2026-01-30 09:26:49.414288493 +0000 UTC m=+0.075264523 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:26:53 np0005601978 ovn_controller[95419]: 2026-01-30T09:26:53Z|00116|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:26:53 np0005601978 ovn_controller[95419]: 2026-01-30T09:26:53Z|00117|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:26:53 np0005601978 podman[212749]: 2026-01-30 09:26:53.427384448 +0000 UTC m=+0.085332498 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:26:53 np0005601978 nova_compute[182955]: 2026-01-30 09:26:53.456 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:26:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:57.334 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:26:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:57.334 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:26:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:26:57.335 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:26:57 np0005601978 podman[212775]: 2026-01-30 09:26:57.403164269 +0000 UTC m=+0.061598272 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:27:08 np0005601978 podman[212799]: 2026-01-30 09:27:08.421746409 +0000 UTC m=+0.075674472 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.component=ubi9-minimal-container, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 30 04:27:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:09.755 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:27:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:09.757 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:27:10 np0005601978 podman[212822]: 2026-01-30 09:27:10.410965902 +0000 UTC m=+0.070291556 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 30 04:27:14 np0005601978 ovn_controller[95419]: 2026-01-30T09:27:14Z|00118|chassis|WARN|Dropped 3 log messages in last 21 seconds (most recently, 13 seconds ago) due to excessive rate
Jan 30 04:27:14 np0005601978 ovn_controller[95419]: 2026-01-30T09:27:14Z|00119|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:27:17 np0005601978 podman[212843]: 2026-01-30 09:27:17.411558472 +0000 UTC m=+0.070351567 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.015 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.015 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.016 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.031 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.032 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.032 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.032 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.032 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.032 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.032 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.033 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.033 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.051 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.051 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.051 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.051 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.196 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.197 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6030MB free_disk=73.36314010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
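The hypervisor resource view above inlines the host's PCI device list as JSON. A small sketch, assuming the list is copied out of that line (truncated to two entries here), that groups the devices by vendor ID so the virtio (1af4) versus Intel chipset (8086) split is easier to read:

    import json
    from collections import Counter

    # pci_devices copied from the resource-tracker line above (truncated for brevity).
    pci_devices_json = '''[
      {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237",
       "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"},
      {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005",
       "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}
    ]'''

    devices = json.loads(pci_devices_json)
    print(Counter(d["vendor_id"] for d in devices))  # per-vendor device count
    for d in devices:
        print(d["address"], d["vendor_id"], d["product_id"], d["dev_type"])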
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.197 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.197 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.375 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.376 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.406 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.418 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
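The inventory dictionary reported to Placement above carries, per resource class, a total, a reserved amount and an allocation_ratio. Assuming the standard Placement capacity formula (total - reserved) * allocation_ratio, the values in that line work out as follows:

    # Inventory values as reported in the log line above.
    inventory = {
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity:g}")
    # MEMORY_MB: 7167, VCPU: 32, DISK_GB: 70.2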
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.420 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.421 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.421 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.422 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.464 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.464 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.464 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:27:19 np0005601978 nova_compute[182955]: 2026-01-30 09:27:19.479 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:20 np0005601978 podman[212868]: 2026-01-30 09:27:20.424763541 +0000 UTC m=+0.069354339 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:27:21 np0005601978 nova_compute[182955]: 2026-01-30 09:27:21.572 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 45.27 sec#033[00m
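The oslo.service warning above says the periodic DbDriver._report_state call overran its scheduled interval by about 45 seconds, which lines up with the south-bound OVSDB connection trouble seen elsewhere in this section. A bare-bones sketch of a fixed-interval loop that raises the same kind of warning when the callee overruns its slot; this is a simplification of the pattern, not oslo.service's FixedIntervalLoopingCall:

    import logging
    import time

    log = logging.getLogger("loop-sketch")

    def fixed_interval_loop(func, interval, iterations):
        """Call func() every `interval` seconds; warn when a run outlasts the interval."""
        for _ in range(iterations):
            start = time.monotonic()
            func()
            elapsed = time.monotonic() - start
            if elapsed > interval:
                log.warning("Function %r run outlasted interval by %.2f sec",
                            func, elapsed - interval)
            else:
                time.sleep(interval - elapsed)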
Jan 30 04:27:24 np0005601978 podman[212887]: 2026-01-30 09:27:24.441131879 +0000 UTC m=+0.100180761 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:27:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:27.381 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:27:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:27.382 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:27:28 np0005601978 podman[212913]: 2026-01-30 09:27:28.416534086 +0000 UTC m=+0.070627201 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:27:28 np0005601978 nova_compute[182955]: 2026-01-30 09:27:28.460 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:27:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:34.386 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:27:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:34.387 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:27:39 np0005601978 podman[212937]: 2026-01-30 09:27:39.439049103 +0000 UTC m=+0.094096838 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.7)
Jan 30 04:27:41 np0005601978 podman[212959]: 2026-01-30 09:27:41.398056944 +0000 UTC m=+0.058177553 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 30 04:27:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:42.835 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:27:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:42.835 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:27:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:42.837 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:27:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:42.837 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:27:43 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:43.845 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:27:43 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:43.846 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:27:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:44.847 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:27:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:44.847 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:27:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:44.847 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:27:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:44.848 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:27:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:46.853 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:27:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:46.854 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:27:48 np0005601978 podman[212980]: 2026-01-30 09:27:48.417435718 +0000 UTC m=+0.067757912 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:27:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:48.855 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:27:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:48.855 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:27:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:48.856 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:27:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:48.856 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:27:51 np0005601978 podman[213004]: 2026-01-30 09:27:51.413837785 +0000 UTC m=+0.066470091 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:27:52 np0005601978 nova_compute[182955]: 2026-01-30 09:27:52.372 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:27:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:52.860 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:27:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:52.861 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:27:55 np0005601978 ovn_controller[95419]: 2026-01-30T09:27:55Z|00120|chassis|WARN|Dropped 3 log messages in last 41 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:27:55 np0005601978 ovn_controller[95419]: 2026-01-30T09:27:55Z|00121|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:27:55 np0005601978 podman[213024]: 2026-01-30 09:27:55.441598715 +0000 UTC m=+0.097587150 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:27:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:27:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:56.862 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:27:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:56.862 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:27:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:56.865 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:27:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:56.866 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:27:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:57.337 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:27:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:57.337 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:27:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:27:57.338 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:27:59 np0005601978 podman[213050]: 2026-01-30 09:27:59.387239963 +0000 UTC m=+0.048407551 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:28:04 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:04.878 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:28:04 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:04.889 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:28:07 np0005601978 ovn_controller[95419]: 2026-01-30T09:28:07Z|00122|chassis|WARN|Dropped 37 log messages in last 12 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 30 04:28:07 np0005601978 ovn_controller[95419]: 2026-01-30T09:28:07Z|00123|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:28:10 np0005601978 podman[213075]: 2026-01-30 09:28:10.392661725 +0000 UTC m=+0.055631343 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.7, container_name=openstack_network_exporter, distribution-scope=public, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:28:12 np0005601978 podman[213096]: 2026-01-30 09:28:12.418517396 +0000 UTC m=+0.072963427 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 30 04:28:19 np0005601978 podman[213116]: 2026-01-30 09:28:19.399451545 +0000 UTC m=+0.054342501 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:28:20 np0005601978 ovn_controller[95419]: 2026-01-30T09:28:20Z|00124|chassis|WARN|Dropped 30 log messages in last 13 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 30 04:28:20 np0005601978 ovn_controller[95419]: 2026-01-30T09:28:20Z|00125|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:28:22 np0005601978 podman[213140]: 2026-01-30 09:28:22.416194256 +0000 UTC m=+0.063523641 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:28:26 np0005601978 podman[213161]: 2026-01-30 09:28:26.419651937 +0000 UTC m=+0.082524182 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:28:30 np0005601978 podman[213187]: 2026-01-30 09:28:30.398032815 +0000 UTC m=+0.053211276 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:28:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:35.506 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:28:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:35.506 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:28:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:35.508 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:28:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:35.508 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:28:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:36.521 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:28:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:36.521 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:28:36 np0005601978 nova_compute[182955]: 2026-01-30 09:28:36.836 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 45.26 sec
Jan 30 04:28:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:37.523 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:28:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:37.523 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:28:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:37.523 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:28:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:37.523 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:28:39 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:39.529 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:28:39 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:39.529 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:28:41 np0005601978 podman[213212]: 2026-01-30 09:28:41.400101848 +0000 UTC m=+0.049294564 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, release=1769056855, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Jan 30 04:28:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:41.530 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:28:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:41.531 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:28:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:41.532 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:28:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:41.532 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:28:43 np0005601978 podman[213233]: 2026-01-30 09:28:43.391491128 +0000 UTC m=+0.051913615 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:28:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:45.543 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:28:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:45.544 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.126 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.126 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.126 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.157 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.157 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.158 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.158 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.328 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.329 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6034MB free_disk=73.36314010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.329 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.330 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:49 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:49.545 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:28:49 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:49.546 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:28:49 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:49.549 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:28:49 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:49.549 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.706 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.707 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.806 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing inventories for resource provider 5912bad0-7860-4f37-8078-1db5720295f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.910 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating ProviderTree inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.910 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.931 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing aggregate associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.952 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing trait associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 30 04:28:49 np0005601978 nova_compute[182955]: 2026-01-30 09:28:49.975 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:28:50 np0005601978 nova_compute[182955]: 2026-01-30 09:28:50.035 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:28:50 np0005601978 nova_compute[182955]: 2026-01-30 09:28:50.036 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 30 04:28:50 np0005601978 nova_compute[182955]: 2026-01-30 09:28:50.037 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:50 np0005601978 podman[213253]: 2026-01-30 09:28:50.400221367 +0000 UTC m=+0.057357054 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:28:53 np0005601978 podman[213277]: 2026-01-30 09:28:53.384125057 +0000 UTC m=+0.042938602 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.099 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.099 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.100 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.100 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.133 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.133 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.134 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.134 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.135 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:55 np0005601978 nova_compute[182955]: 2026-01-30 09:28:55.135 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:55 np0005601978 ovn_controller[95419]: 2026-01-30T09:28:55Z|00126|chassis|WARN|Dropped 17 log messages in last 36 seconds (most recently, 30 seconds ago) due to excessive rate
Jan 30 04:28:55 np0005601978 ovn_controller[95419]: 2026-01-30T09:28:55Z|00127|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:28:56 np0005601978 nova_compute[182955]: 2026-01-30 09:28:56.464 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:28:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:57.338 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:28:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:57.339 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:28:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:57.339 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:28:57 np0005601978 podman[213296]: 2026-01-30 09:28:57.412691415 +0000 UTC m=+0.072463263 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:28:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:57.553 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:28:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:57.564 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:28:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:57.564 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 30 04:28:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:28:57.569 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:29:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:00.567 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:29:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:00.567 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 30 04:29:01 np0005601978 podman[213323]: 2026-01-30 09:29:01.40322062 +0000 UTC m=+0.062067085 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:29:07 np0005601978 ovn_controller[95419]: 2026-01-30T09:29:07Z|00128|chassis|WARN|Dropped 55 log messages in last 12 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 30 04:29:07 np0005601978 ovn_controller[95419]: 2026-01-30T09:29:07Z|00129|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:29:12 np0005601978 podman[213348]: 2026-01-30 09:29:12.376087177 +0000 UTC m=+0.040589395 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.7, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 30 04:29:14 np0005601978 podman[213367]: 2026-01-30 09:29:14.393714911 +0000 UTC m=+0.053656876 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.040 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:29:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:17.041 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:29:21 np0005601978 podman[213387]: 2026-01-30 09:29:21.383452389 +0000 UTC m=+0.046639729 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:29:21 np0005601978 ovn_controller[95419]: 2026-01-30T09:29:21Z|00130|chassis|WARN|Dropped 44 log messages in last 13 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 30 04:29:21 np0005601978 ovn_controller[95419]: 2026-01-30T09:29:21Z|00131|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:29:24 np0005601978 podman[213411]: 2026-01-30 09:29:24.382568002 +0000 UTC m=+0.048307299 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:29:28 np0005601978 podman[213430]: 2026-01-30 09:29:28.419242204 +0000 UTC m=+0.078142458 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 30 04:29:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:31.890 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:29:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:31.891 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:29:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:31.892 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:29:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:31.892 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:29:31 np0005601978 podman[213457]: 2026-01-30 09:29:31.996754596 +0000 UTC m=+0.078209993 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:29:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:32.899 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:29:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:32.900 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:29:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:33.900 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:29:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:33.901 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:29:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:33.902 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:29:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:33.902 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:29:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:35.907 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:29:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:35.907 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:29:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:37.909 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:29:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:37.909 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:29:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:37.910 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:29:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:37.911 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:29:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:41.913 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:29:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:41.923 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:29:43 np0005601978 podman[213482]: 2026-01-30 09:29:43.38638328 +0000 UTC m=+0.051484004 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 30 04:29:45 np0005601978 podman[213505]: 2026-01-30 09:29:45.432632192 +0000 UTC m=+0.091743443 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:29:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:45.917 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:29:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:45.917 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:29:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:45.929 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:29:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:45.929 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:29:49 np0005601978 ovn_controller[95419]: 2026-01-30T09:29:49Z|00132|chassis|WARN|Dropped 3 log messages in last 28 seconds (most recently, 21 seconds ago) due to excessive rate
Jan 30 04:29:49 np0005601978 ovn_controller[95419]: 2026-01-30T09:29:49Z|00133|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.102 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 45.26 sec
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.104 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.104 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.104 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.167 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.167 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.168 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.168 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.168 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.169 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.169 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.169 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.169 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.211 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.211 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.211 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.211 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.410 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.412 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6030MB free_disk=73.36312103271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.412 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.412 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:52 np0005601978 podman[213525]: 2026-01-30 09:29:52.426735872 +0000 UTC m=+0.081315777 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.494 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.495 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.521 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.540 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
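The inventory dict above is what the Placement service uses to derive schedulable capacity, per its documented formula capacity = (total - reserved) * allocation_ratio for each resource class. A worked example with the values reported here:

    # Capacity math for the inventory logged above.
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        print(rc, (inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
    # VCPU 32.0 / MEMORY_MB 7167.0 / DISK_GB 70.2

So this node can overcommit to 32 schedulable vCPUs, offers 7167 MB of guest-usable RAM, and about 70 GB of schedulable disk.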
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.542 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 30 04:29:52 np0005601978 nova_compute[182955]: 2026-01-30 09:29:52.543 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
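The Acquiring/acquired/released triplets around _update_available_resource are oslo.concurrency's named-lock pattern: the resource tracker serialises every update to its view behind the "compute_resources" lock, and the lock wrapper logs how long each caller waited and held. A minimal sketch of the pattern, assuming only the real lockutils.synchronized decorator (the function body is a placeholder, not nova's code):

    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def update_available_resource():
        # Runs with the "compute_resources" lock held, so concurrent periodic
        # tasks cannot mutate the tracker's state mid-update. The decorator's
        # wrapper emits the "waited"/"held" debug lines seen above.
        pass

    update_available_resource()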
Jan 30 04:29:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:53.928 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:29:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:53.938 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:29:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:53.939 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 30 04:29:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:53.943 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:29:55 np0005601978 podman[213549]: 2026-01-30 09:29:55.428239801 +0000 UTC m=+0.088776793 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:29:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:29:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
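Every pollster in this cycle is skipped because discovery returned no resources, which is consistent with the resource tracker reporting used_vcpus=0 above: the node hosts no instances, so there is nothing for the libvirt-backed meters to sample. A hypothetical simplification of that decision (the real logic lives in ceilometer/polling/manager.py, poll_and_notify):

    import logging

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    def poll_and_notify(pollsters, discovered):
        for name in pollsters:
            resources = discovered.get(name, [])
            if not resources:
                LOG.debug("Skip pollster %s, no resources found this cycle", name)
                continue
            # ... collect samples for `resources` and publish them ...

    poll_and_notify(["cpu", "memory.usage"], {})  # logs two skips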
Jan 30 04:29:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:57.339 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:29:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:57.340 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:29:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:29:57.340 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:29:59 np0005601978 podman[213568]: 2026-01-30 09:29:59.479913232 +0000 UTC m=+0.134799907 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:30:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:01Z|00134|chassis|WARN|Dropped 34 log messages in last 12 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 30 04:30:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:01Z|00135|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
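This recurring warning means the southbound database already holds an Encap row binding IP 172.19.0.101 with type geneve to chassis d14b9ab5-bf6e-4142-ad45-b863645e483d, so the same encap cannot also be registered for chassis 9803b804-d88a-4443-b777-6ecddbb75ed8; a stale chassis record left over from a redeploy or rename is a common cause. A hedged diagnostic sketch, assuming ovn-sbctl can reach the SB DB from wherever it runs; a confirmed-stale chassis can then be removed with ovn-sbctl chassis-del <name>:

    # List Encap rows and look for two chassis claiming the same (ip, type).
    import json
    import subprocess

    out = subprocess.run(
        ["ovn-sbctl", "--format=json", "list", "Encap"],
        capture_output=True, text=True, check=True,
    ).stdout
    table = json.loads(out)
    for row in table["data"]:
        rec = dict(zip(table["headings"], row))
        print(rec.get("ip"), rec.get("type"), rec.get("chassis_name"))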
Jan 30 04:30:02 np0005601978 podman[213594]: 2026-01-30 09:30:02.399452791 +0000 UTC m=+0.057739323 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:30:02 np0005601978 nova_compute[182955]: 2026-01-30 09:30:02.507 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:02 np0005601978 nova_compute[182955]: 2026-01-30 09:30:02.508 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:02 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:02.940 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:30:02 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:02.941 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.572 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.572 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.572 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.572 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.762 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
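Context for the warning: nova's `socket` PCI NUMA affinity policy assumes each NUMA node belongs to a single socket, and this (virtual) host reports the opposite, so the policy cannot be honoured here. The policy is normally requested through a flavor extra spec or a [pci] alias; an illustrative extra-spec value only, not this deployment's configuration:

    # Illustrative flavor extra spec that this host could not honour.
    flavor_extra_specs = {
        "hw:pci_numa_affinity_policy": "socket",
    }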
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.763 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6027MB free_disk=73.36314010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.763 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.763 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.912 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.912 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.941 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.956 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.958 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 30 04:30:03 np0005601978 nova_compute[182955]: 2026-01-30 09:30:03.958 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:30:04 np0005601978 nova_compute[182955]: 2026-01-30 09:30:04.959 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:04 np0005601978 nova_compute[182955]: 2026-01-30 09:30:04.959 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:30:04 np0005601978 nova_compute[182955]: 2026-01-30 09:30:04.960 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 30 04:30:04 np0005601978 nova_compute[182955]: 2026-01-30 09:30:04.978 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 30 04:30:06 np0005601978 nova_compute[182955]: 2026-01-30 09:30:06.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:06 np0005601978 nova_compute[182955]: 2026-01-30 09:30:06.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:06 np0005601978 nova_compute[182955]: 2026-01-30 09:30:06.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:07 np0005601978 nova_compute[182955]: 2026-01-30 09:30:07.430 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:09 np0005601978 nova_compute[182955]: 2026-01-30 09:30:09.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:30:09 np0005601978 nova_compute[182955]: 2026-01-30 09:30:09.433 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
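The steady stream of "Running periodic task ComputeManager._..." lines is oslo.service's periodic-task machinery firing each decorated manager method on its interval; _reclaim_queued_deletes then returns immediately because deferred delete is disabled (CONF.reclaim_instance_interval <= 0). A minimal sketch using the real decorator, with an illustrative 10-second spacing rather than nova's configuration:

    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=10)
        def _poll_something(self, context):
            # Analogous to _reclaim_queued_deletes: check a config knob and
            # return early when the feature is disabled.
            pass

    # The service loop periodically calls run_periodic_tasks(context), which
    # logs "Running periodic task ..." for each task that is due.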
Jan 30 04:30:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:13Z|00136|chassis|WARN|Dropped 35 log messages in last 12 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 30 04:30:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:13Z|00137|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:30:14 np0005601978 podman[213619]: 2026-01-30 09:30:14.394159787 +0000 UTC m=+0.055166260 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, release=1769056855, config_id=openstack_network_exporter, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:30:16 np0005601978 podman[213640]: 2026-01-30 09:30:16.402790603 +0000 UTC m=+0.062699478 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:30:23 np0005601978 podman[213661]: 2026-01-30 09:30:23.415325167 +0000 UTC m=+0.069565820 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:30:26 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:26Z|00138|chassis|WARN|Dropped 35 log messages in last 12 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 30 04:30:26 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:26Z|00139|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:30:26 np0005601978 podman[213686]: 2026-01-30 09:30:26.400929841 +0000 UTC m=+0.059814360 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:30:30 np0005601978 podman[213705]: 2026-01-30 09:30:30.422259308 +0000 UTC m=+0.081435269 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:30:33 np0005601978 podman[213730]: 2026-01-30 09:30:33.405656122 +0000 UTC m=+0.057149368 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:30:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:37.554 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:30:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:37.555 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:30:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:37.556 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:30:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:37.557 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:30:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:38.563 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:30:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:38.563 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:30:39 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:39.564 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:30:39 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:39.565 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:30:39 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:39.565 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:30:39 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:39.566 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:30:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:41.573 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:30:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:41.573 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:30:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:41.581 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:30:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:41.584 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
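Both IDL connections in the metadata agent (worker pids 104657 and 105213) lose the SB session and recover within about four seconds: receive error, connection dropped, a reconnect attempt that times out, a fixed 2-second wait, then a successful retry. A generic illustration of that retry-with-wait shape, not ovsdbapp's actual state machine (host/port are placeholders):

    import socket
    import time

    def connect_with_retry(host, port, wait_s=2.0, attempts=5):
        for n in range(1, attempts + 1):
            try:
                print(f"connecting... (attempt {n})")
                return socket.create_connection((host, port), timeout=1.0)
            except OSError as exc:
                print(f"attempt failed ({exc}); "
                      f"waiting {wait_s:g} seconds before reconnect")
                time.sleep(wait_s)
        raise ConnectionError(f"gave up after {attempts} attempts")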
Jan 30 04:30:41 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:41Z|00140|chassis|WARN|Dropped 9 log messages in last 16 seconds (most recently, 12 seconds ago) due to excessive rate
Jan 30 04:30:41 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:41Z|00141|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:30:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:42.892 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 30 04:30:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:42.893 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 30 04:30:45 np0005601978 podman[213754]: 2026-01-30 09:30:45.403604257 +0000 UTC m=+0.066271473 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, version=9.7, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1769056855, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 30 04:30:47 np0005601978 podman[213775]: 2026-01-30 09:30:47.418574912 +0000 UTC m=+0.074999558 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 30 04:30:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:52.896 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:30:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:52.896 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 30 04:30:53 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:53Z|00142|chassis|WARN|Dropped 34 log messages in last 12 seconds (most recently, 1 second ago) due to excessive rate
Jan 30 04:30:53 np0005601978 ovn_controller[95419]: 2026-01-30T09:30:53Z|00143|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:30:54 np0005601978 podman[213797]: 2026-01-30 09:30:54.400363099 +0000 UTC m=+0.059769779 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:30:56 np0005601978 nova_compute[182955]: 2026-01-30 09:30:56.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:30:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:57.340 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:30:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:57.340 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:30:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:30:57.340 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:30:57 np0005601978 podman[213823]: 2026-01-30 09:30:57.388233257 +0000 UTC m=+0.049994108 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:31:01 np0005601978 podman[213841]: 2026-01-30 09:31:01.447411636 +0000 UTC m=+0.105785573 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:31:04 np0005601978 podman[213868]: 2026-01-30 09:31:04.416723247 +0000 UTC m=+0.069993381 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:31:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:31:06Z|00144|chassis|WARN|Dropped 22 log messages in last 12 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 30 04:31:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:31:06Z|00145|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.046 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 42.95 sec#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.049 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.050 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.050 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.064 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.065 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.065 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.065 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.066 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.066 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.066 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.066 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.067 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.105 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.106 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.106 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.107 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.546 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.548 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6026MB free_disk=73.36314010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.548 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.548 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.630 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.630 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.661 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.689 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.692 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:31:15 np0005601978 nova_compute[182955]: 2026-01-30 09:31:15.692 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:16 np0005601978 podman[213893]: 2026-01-30 09:31:16.419885376 +0000 UTC m=+0.074948047 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, architecture=x86_64, io.buildah.version=1.33.7)
Jan 30 04:31:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:17.106 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:31:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:17.107 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:31:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:17.108 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:31:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:17.109 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:31:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:18.114 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:31:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:18.114 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:31:18 np0005601978 podman[213916]: 2026-01-30 09:31:18.411608313 +0000 UTC m=+0.064656785 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Jan 30 04:31:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:19.115 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:31:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:19.115 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:31:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:19.115 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:31:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:19.115 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:31:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:21.121 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:31:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:21.122 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:31:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:23.123 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:31:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:23.123 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:31:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:23.123 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:31:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:23.124 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:31:25 np0005601978 podman[213936]: 2026-01-30 09:31:25.384147803 +0000 UTC m=+0.046432435 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:31:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:27.128 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:31:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:27.129 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:31:28 np0005601978 podman[213960]: 2026-01-30 09:31:28.428604704 +0000 UTC m=+0.088219009 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 30 04:31:28 np0005601978 nova_compute[182955]: 2026-01-30 09:31:28.693 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:31.132 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:31:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:31.133 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:31:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:31.133 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:31:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:31.133 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:31:32 np0005601978 ovn_controller[95419]: 2026-01-30T09:31:32Z|00146|chassis|WARN|Dropped 3 log messages in last 26 seconds (most recently, 25 seconds ago) due to excessive rate
Jan 30 04:31:32 np0005601978 ovn_controller[95419]: 2026-01-30T09:31:32Z|00147|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:31:32 np0005601978 podman[213980]: 2026-01-30 09:31:32.430215076 +0000 UTC m=+0.081964612 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:31:35 np0005601978 podman[214009]: 2026-01-30 09:31:35.418447852 +0000 UTC m=+0.071523036 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:31:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:46.232 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:31:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:46.239 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:31:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:46.239 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:31:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:46.283 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:31:47 np0005601978 podman[214034]: 2026-01-30 09:31:47.399562841 +0000 UTC m=+0.061087899 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.7, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc.)
Jan 30 04:31:49 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:49.241 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:31:49 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:49.241 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:31:49 np0005601978 podman[214056]: 2026-01-30 09:31:49.406235576 +0000 UTC m=+0.068081525 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 30 04:31:55 np0005601978 nova_compute[182955]: 2026-01-30 09:31:55.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:31:55 np0005601978 nova_compute[182955]: 2026-01-30 09:31:55.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:31:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:31:56 np0005601978 podman[214076]: 2026-01-30 09:31:56.398265052 +0000 UTC m=+0.057614337 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:31:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:57.341 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:31:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:57.341 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:31:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:31:57.341 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:31:59 np0005601978 podman[214100]: 2026-01-30 09:31:59.405197658 +0000 UTC m=+0.064505610 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 30 04:32:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:01Z|00148|chassis|WARN|Dropped 1 log messages in last 29 seconds (most recently, 29 seconds ago) due to excessive rate
Jan 30 04:32:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:01Z|00149|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:32:03 np0005601978 podman[214120]: 2026-01-30 09:32:03.503625556 +0000 UTC m=+0.155759003 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:32:06 np0005601978 podman[214145]: 2026-01-30 09:32:06.390563236 +0000 UTC m=+0.055154099 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:32:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:13Z|00150|chassis|WARN|Dropped 50 log messages in last 10 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 30 04:32:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:13Z|00151|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.042 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:32:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:17.043 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
Jan 30 04:32:18 np0005601978 nova_compute[182955]: 2026-01-30 09:32:18.022 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:32:18 np0005601978 nova_compute[182955]: 2026-01-30 09:32:18.022 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:18 np0005601978 nova_compute[182955]: 2026-01-30 09:32:18.023 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:32:18 np0005601978 nova_compute[182955]: 2026-01-30 09:32:18.039 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:18 np0005601978 podman[214171]: 2026-01-30 09:32:18.433311462 +0000 UTC m=+0.083340448 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64)
Jan 30 04:32:20 np0005601978 podman[214191]: 2026-01-30 09:32:20.446686256 +0000 UTC m=+0.100704998 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:32:21 np0005601978 nova_compute[182955]: 2026-01-30 09:32:21.605 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 46.56 sec#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.052 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.053 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.053 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.054 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.077 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.078 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.079 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.079 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.080 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.080 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.081 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.081 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.082 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.109 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.110 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.110 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.111 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.293 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.295 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6025MB free_disk=73.36314010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.296 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.296 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.374 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.375 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.418 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.438 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.441 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:32:24 np0005601978 nova_compute[182955]: 2026-01-30 09:32:24.441 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:27 np0005601978 podman[214211]: 2026-01-30 09:32:27.397831023 +0000 UTC m=+0.058002246 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:32:30 np0005601978 podman[214236]: 2026-01-30 09:32:30.403020427 +0000 UTC m=+0.062465832 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:32:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:31.218 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:32:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:31.219 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:32:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:31.219 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:32:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:31.221 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:32:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:32.225 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:32:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:32.226 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:32:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:33.227 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:32:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:33.227 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:32:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:33.228 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:32:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:33.229 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:32:34 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:34Z|00152|chassis|WARN|Dropped 30 log messages in last 20 seconds (most recently, 13 seconds ago) due to excessive rate
Jan 30 04:32:34 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:34Z|00153|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:32:34 np0005601978 podman[214255]: 2026-01-30 09:32:34.443899791 +0000 UTC m=+0.100488914 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:32:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:35.234 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:32:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:35.235 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:32:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:37.237 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:32:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:37.237 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:32:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:37.238 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:32:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:37.238 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:32:37 np0005601978 podman[214282]: 2026-01-30 09:32:37.403394942 +0000 UTC m=+0.061143331 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:32:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:41.245 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:32:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:41.245 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:32:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:41.251 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:32:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:41.255 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:32:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:44.069 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:32:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:44.070 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:32:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:47Z|00154|chassis|WARN|Dropped 1 log messages in last 14 seconds (most recently, 14 seconds ago) due to excessive rate
Jan 30 04:32:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:47Z|00155|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:32:49 np0005601978 podman[214307]: 2026-01-30 09:32:49.409618182 +0000 UTC m=+0.069241044 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=)
Jan 30 04:32:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:51.072 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:32:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:51.073 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:32:51 np0005601978 podman[214329]: 2026-01-30 09:32:51.416560073 +0000 UTC m=+0.070169204 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:32:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:57.342 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:32:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:57.343 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:32:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:32:57.343 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:32:58 np0005601978 podman[214350]: 2026-01-30 09:32:58.410998007 +0000 UTC m=+0.060206669 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:32:59 np0005601978 nova_compute[182955]: 2026-01-30 09:32:59.817 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:32:59 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:59Z|00156|chassis|WARN|Dropped 59 log messages in last 12 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 30 04:32:59 np0005601978 ovn_controller[95419]: 2026-01-30T09:32:59Z|00157|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:33:01 np0005601978 podman[214375]: 2026-01-30 09:33:01.410932526 +0000 UTC m=+0.066639260 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 30 04:33:05 np0005601978 podman[214394]: 2026-01-30 09:33:05.477719544 +0000 UTC m=+0.133619019 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:33:08 np0005601978 podman[214420]: 2026-01-30 09:33:08.388155092 +0000 UTC m=+0.047340643 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:33:12 np0005601978 ovn_controller[95419]: 2026-01-30T09:33:12Z|00158|chassis|WARN|Dropped 45 log messages in last 12 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 30 04:33:12 np0005601978 ovn_controller[95419]: 2026-01-30T09:33:12Z|00159|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:33:20 np0005601978 podman[214444]: 2026-01-30 09:33:20.417074754 +0000 UTC m=+0.072921061 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 30 04:33:22 np0005601978 podman[214465]: 2026-01-30 09:33:22.419712393 +0000 UTC m=+0.075575102 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:33:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:23.698 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:33:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:23.698 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:33:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:23.700 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:33:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:23.700 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:33:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:24.707 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:33:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:24.708 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:33:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:25.709 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:33:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:25.709 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:33:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:25.709 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:33:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:25.710 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:33:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:27.714 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:33:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:27.715 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:33:29 np0005601978 podman[214485]: 2026-01-30 09:33:29.397385016 +0000 UTC m=+0.056834578 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:33:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:29.716 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:33:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:29.717 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:33:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:29.718 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:33:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:29.718 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:33:32 np0005601978 podman[214509]: 2026-01-30 09:33:32.38431503 +0000 UTC m=+0.046687590 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:33:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:33.721 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:33:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:33.726 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.311 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.312 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.312 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.502 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.502 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.502 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.503 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.503 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.503 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.504 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.504 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.504 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.554 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.555 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.555 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.555 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.676 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.677 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6026MB free_disk=73.36314010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.677 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.678 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.755 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.756 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.780 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.802 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.803 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:33:34 np0005601978 nova_compute[182955]: 2026-01-30 09:33:34.804 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:33:36 np0005601978 ovn_controller[95419]: 2026-01-30T09:33:36Z|00160|chassis|WARN|Dropped 7 log messages in last 24 seconds (most recently, 22 seconds ago) due to excessive rate
Jan 30 04:33:36 np0005601978 ovn_controller[95419]: 2026-01-30T09:33:36Z|00161|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:33:36 np0005601978 podman[214528]: 2026-01-30 09:33:36.391662908 +0000 UTC m=+0.056375387 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 30 04:33:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:37.722 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:33:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:37.723 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:33:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:37.726 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:33:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:37.727 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:33:38 np0005601978 nova_compute[182955]: 2026-01-30 09:33:38.404 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 46.80 sec#033[00m
Jan 30 04:33:39 np0005601978 podman[214554]: 2026-01-30 09:33:39.380784301 +0000 UTC m=+0.045369477 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:33:45 np0005601978 nova_compute[182955]: 2026-01-30 09:33:45.415 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:33:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:45.738 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:33:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:45.741 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:33:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:45.752 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:33:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:45.754 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:33:49 np0005601978 ovn_controller[95419]: 2026-01-30T09:33:49Z|00162|chassis|WARN|Dropped 28 log messages in last 13 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 30 04:33:49 np0005601978 ovn_controller[95419]: 2026-01-30T09:33:49Z|00163|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:33:50 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:50.756 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:33:50 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:50.756 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:33:51 np0005601978 podman[214578]: 2026-01-30 09:33:51.397526654 +0000 UTC m=+0.052182544 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.buildah.version=1.33.7)
Jan 30 04:33:53 np0005601978 podman[214600]: 2026-01-30 09:33:53.412646219 +0000 UTC m=+0.073809672 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.753 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:33:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:33:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:57.343 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:33:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:57.343 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:33:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:33:57.343 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:34:00 np0005601978 podman[214620]: 2026-01-30 09:34:00.418553682 +0000 UTC m=+0.079487698 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:34:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:34:01Z|00164|chassis|WARN|Dropped 16 log messages in last 12 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 30 04:34:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:34:01Z|00165|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:34:02 np0005601978 nova_compute[182955]: 2026-01-30 09:34:02.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:03 np0005601978 podman[214644]: 2026-01-30 09:34:03.427620373 +0000 UTC m=+0.088688587 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:34:06 np0005601978 nova_compute[182955]: 2026-01-30 09:34:06.435 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:34:06 np0005601978 nova_compute[182955]: 2026-01-30 09:34:06.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:34:06 np0005601978 nova_compute[182955]: 2026-01-30 09:34:06.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:34:07 np0005601978 podman[214663]: 2026-01-30 09:34:07.454692796 +0000 UTC m=+0.102928296 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 30 04:34:10 np0005601978 podman[214691]: 2026-01-30 09:34:10.442788907 +0000 UTC m=+0.097188571 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:34:22 np0005601978 podman[214715]: 2026-01-30 09:34:22.406540334 +0000 UTC m=+0.063239980 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
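The three podman lines above are health_status events: podman ran each container's configured healthcheck (the script bind-mounted under /var/lib/openstack/healthchecks/<name>, per the 'healthcheck' key in config_data) and recorded healthy with health_failing_streak=0, i.e. no consecutive failures. A small sketch for querying the same state by hand; the container name is taken from the log, and since the layout of podman's inspect output varies across versions, this hedges between the "Health" and "Healthcheck" keys:

    import json
    import subprocess

    def health_state(container):
        # Ask podman for the recorded state of one container and pull
        # out the healthcheck section, whichever key this podman
        # version uses ("Health" on newer releases, "Healthcheck" on
        # older ones).
        out = subprocess.run(
            ["podman", "inspect", "--format", "{{json .State}}", container],
            capture_output=True, text=True, check=True).stdout
        state = json.loads(out)
        return state.get("Health") or state.get("Healthcheck")

    print(health_state("ovn_controller"))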
Jan 30 04:34:22 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:22.706 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:34:22 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:22.706 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:34:22 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:22.708 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:34:22 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:22.708 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:34:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:23.715 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:34:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:23.716 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:34:24 np0005601978 podman[214736]: 2026-01-30 09:34:24.422635611 +0000 UTC m=+0.070291488 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:34:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:24.716 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:34:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:24.717 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:34:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:24.718 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:34:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:24.718 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:34:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:26.723 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:34:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:26.724 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:34:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:28.725 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:34:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:28.726 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:34:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:28.726 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:34:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:28.727 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:34:31 np0005601978 podman[214757]: 2026-01-30 09:34:31.434189438 +0000 UTC m=+0.089315002 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:34:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:32.730 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:34:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:32.735 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:34:34 np0005601978 podman[214781]: 2026-01-30 09:34:34.390887391 +0000 UTC m=+0.054092082 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:34:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:36.734 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:34:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:36.734 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:34:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:36.740 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:34:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:36.741 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
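At this point both IDL workers in the metadata agent (threads 104657 and 105213) have lost the southbound OVSDB connection and are backing off: 2 s, then 4 s, then retrying in the background with further logging suppressed. A minimal sketch of that capped exponential-backoff pattern (illustrative only, not ovsdbapp's actual reconnect code; the function name, timeout, cap, and log limit are assumptions):

    import socket
    import ssl
    import time

    def reconnect_with_backoff(host, port, max_delay=8, log_limit=3):
        # Double the wait after every failed attempt, cap it at
        # max_delay, and go quiet after log_limit attempts while
        # still retrying in the background.
        delay, attempts = 1, 0
        while True:
            attempts += 1
            try:
                raw = socket.create_connection((host, port), timeout=2)
                return ssl.create_default_context().wrap_socket(
                    raw, server_hostname=host)
            except OSError as err:
                if attempts <= log_limit:
                    print(f"{host}:{port}: {err}; retrying in {delay}s")
                elif attempts == log_limit + 1:
                    print("continuing to reconnect in the background "
                          "but suppressing further logging")
                time.sleep(delay)
                delay = min(delay * 2, max_delay)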
Jan 30 04:34:38 np0005601978 ovn_controller[95419]: 2026-01-30T09:34:38Z|00166|chassis|WARN|Dropped 23 log messages in last 35 seconds (most recently, 26 seconds ago) due to excessive rate
Jan 30 04:34:38 np0005601978 ovn_controller[95419]: 2026-01-30T09:34:38Z|00167|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
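The recurring chassis warning means the southbound database already has a Chassis record (d14b9ab5-bf6e-4142-ad45-b863645e483d) owning encap IP 172.19.0.101 with type geneve, so ovn-controller cannot register the same encap under the newer chassis ID 9803b804-d88a-4443-b777-6ecddbb75ed8. This pattern typically points at a stale chassis row left behind by a re-registration (for example after the host's system-id changed); inspecting the records with 'ovn-sbctl list Chassis' and removing the stale one with 'ovn-sbctl chassis-del <chassis>' is the usual remedy, though confirm which record is live before deleting anything.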
Jan 30 04:34:38 np0005601978 podman[214801]: 2026-01-30 09:34:38.444043656 +0000 UTC m=+0.102890487 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.380 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 42.98 sec
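The _report_state overrun (42.98 s past its slot) is consistent with the wider control-plane connectivity blip above: the service-group heartbeat call blocked well past its interval before completing. oslo.service emits this warning from its fixed-interval loop; a minimal sketch of the pattern (illustrative, not oslo.service's implementation):

    import time

    def fixed_interval_loop(func, interval):
        # Run func every `interval` seconds and warn when one call
        # overruns its slot, mirroring the "run outlasted interval"
        # warning seen in the log line above.
        while True:
            start = time.monotonic()
            func()
            elapsed = time.monotonic() - start
            if elapsed > interval:
                print(f"run outlasted interval by {elapsed - interval:.2f} sec")
            time.sleep(max(0.0, interval - elapsed))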
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.381 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.382 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.382 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.382 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.383 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.383 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.383 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.383 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:34:41 np0005601978 podman[214828]: 2026-01-30 09:34:41.417584188 +0000 UTC m=+0.070183575 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.427 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.428 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.428 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.428 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.584 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.585 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6025MB free_disk=73.36314010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.585 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.586 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.740 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.740 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.825 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing inventories for resource provider 5912bad0-7860-4f37-8078-1db5720295f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.919 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating ProviderTree inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.920 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.942 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing aggregate associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 30 04:34:41 np0005601978 nova_compute[182955]: 2026-01-30 09:34:41.982 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing trait associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 30 04:34:42 np0005601978 nova_compute[182955]: 2026-01-30 09:34:42.011 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 30 04:34:42 np0005601978 nova_compute[182955]: 2026-01-30 09:34:42.029 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 30 04:34:42 np0005601978 nova_compute[182955]: 2026-01-30 09:34:42.030 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 30 04:34:42 np0005601978 nova_compute[182955]: 2026-01-30 09:34:42.030 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
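From the inventory reported above, placement derives the schedulable capacity of each resource class as (total - reserved) * allocation_ratio. Worked out for provider 5912bad0-7860-4f37-8078-1db5720295f4:

    # Capacity as placement computes it from the reported inventory:
    # usable = (total - reserved) * allocation_ratio
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
    }
    for rc, v in inventory.items():
        usable = (v["total"] - v["reserved"]) * v["allocation_ratio"]
        print(rc, usable)  # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 70.2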
Jan 30 04:34:46 np0005601978 nova_compute[182955]: 2026-01-30 09:34:46.025 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:34:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:51.837 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:34:51 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:51.906 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
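Both IDL workers reconnect at 09:34:51, roughly 29 seconds after the 09:34:22 drop, which matches the suppressed-logging background retries noted above.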
Jan 30 04:34:53 np0005601978 podman[214852]: 2026-01-30 09:34:53.406112495 +0000 UTC m=+0.065340551 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1769056855)
Jan 30 04:34:55 np0005601978 podman[214874]: 2026-01-30 09:34:55.437880906 +0000 UTC m=+0.097091738 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:34:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:57.343 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:34:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:57.343 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:34:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:34:57.343 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:35:00 np0005601978 nova_compute[182955]: 2026-01-30 09:35:00.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:35:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:35:01Z|00168|chassis|WARN|Dropped 2 log messages in last 23 seconds (most recently, 23 seconds ago) due to excessive rate
Jan 30 04:35:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:35:01Z|00169|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:35:02 np0005601978 podman[214897]: 2026-01-30 09:35:02.38221807 +0000 UTC m=+0.047659568 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:35:03 np0005601978 nova_compute[182955]: 2026-01-30 09:35:03.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:35:05 np0005601978 podman[214921]: 2026-01-30 09:35:05.421513133 +0000 UTC m=+0.075942123 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:35:08 np0005601978 nova_compute[182955]: 2026-01-30 09:35:08.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:35:08 np0005601978 nova_compute[182955]: 2026-01-30 09:35:08.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 30 04:35:08 np0005601978 nova_compute[182955]: 2026-01-30 09:35:08.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 30 04:35:09 np0005601978 podman[214940]: 2026-01-30 09:35:09.456591517 +0000 UTC m=+0.112640948 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:35:12 np0005601978 podman[214967]: 2026-01-30 09:35:12.405381821 +0000 UTC m=+0.069170061 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:35:16 np0005601978 ovn_controller[95419]: 2026-01-30T09:35:16Z|00170|chassis|WARN|Dropped 20 log messages in last 10 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 30 04:35:16 np0005601978 ovn_controller[95419]: 2026-01-30T09:35:16Z|00171|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.046 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.047 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:35:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:17.048 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
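Editor's note: the traceback above ends in ovsdbapp's RowNotFound, which simply means the Chassis_Private record named 9803b804-d88a-4443-b777-6ecddbb75ed8 was not present in the agent's in-memory copy of the southbound DB when the lookup ran. A minimal sketch of guarding that same lookup; `sb_api` is an assumed, already-connected ovsdbapp backend exposing lookup(), exactly as the call chain in the traceback shows, and is not a name taken from this log:

    # Sketch only, not the agent's actual code: guard the Chassis_Private
    # lookup that raised RowNotFound above.
    from ovsdbapp.backend.ovs_idl import idlutils

    CHASSIS = '9803b804-d88a-4443-b777-6ecddbb75ed8'

    def get_chassis_private(sb_api, name=CHASSIS):
        try:
            # Same path as command.run_idl -> api.lookup -> idlutils.row_by_value
            return sb_api.lookup('Chassis_Private', name)
        except idlutils.RowNotFound:
            # The row can legitimately be absent while the SB view is resyncing
            return None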
Jan 30 04:35:24 np0005601978 podman[214991]: 2026-01-30 09:35:24.417577363 +0000 UTC m=+0.082450249 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., release=1769056855, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:35:26 np0005601978 podman[215014]: 2026-01-30 09:35:26.401780999 +0000 UTC m=+0.062122903 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
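Editor's note: the podman lines above (and the similar ones later in this section) are periodic container healthcheck events; the long parenthesised blob is the container's labels plus its config_data. A small sketch for pulling the container name and health_status out of such journal lines; the regex is an assumption about this exact field layout, not anything podman guarantees:

    # Sketch: extract (name, health_status) from podman health_status
    # journal lines shaped like the two above.
    import re

    HEALTH_RE = re.compile(
        r'container health_status \S+ \(image=[^,]+, name=([^,]+),'
        r'.*?health_status=([^,]+),')

    def parse_health(line):
        m = HEALTH_RE.search(line)
        return (m.group(1), m.group(2)) if m else None

    # parse_health(journal_line) -> ('openstack_network_exporter', 'healthy')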
Jan 30 04:35:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:31.378 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:35:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:31.378 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:35:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:31.380 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:35:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:31.381 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:35:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:32.388 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:35:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:32.388 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:35:33 np0005601978 podman[215037]: 2026-01-30 09:35:33.389290492 +0000 UTC m=+0.052042273 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:35:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:33.389 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:35:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:33.389 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:35:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:33.390 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:35:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:33.390 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:35:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:35.395 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:35:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:35.397 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:35:36 np0005601978 podman[215061]: 2026-01-30 09:35:36.411786555 +0000 UTC m=+0.075275866 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:35:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:37.398 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:35:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:37.398 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:35:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:37.400 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:35:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:37.400 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:35:40 np0005601978 ovn_controller[95419]: 2026-01-30T09:35:40Z|00172|chassis|WARN|Dropped 13 log messages in last 24 seconds (most recently, 19 seconds ago) due to excessive rate
Jan 30 04:35:40 np0005601978 ovn_controller[95419]: 2026-01-30T09:35:40Z|00173|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
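Editor's note: the chassis WARN above says ovn-controller found encap IP 172.19.0.101 (geneve) already registered under Chassis d14b9ab5-bf6e-4142-ad45-b863645e483d and therefore refuses to add it to 9803b804-d88a-4443-b777-6ecddbb75ed8; reading this as a stale or duplicate Chassis entry in the southbound DB is an interpretation, not something stated in the log. A hedged way to inspect the conflicting rows, using only standard ovn-sbctl verbs (in this TLS deployment the commands will likely also need the appropriate --db and certificate options):

    # Sketch: show the Encap rows carrying the contested IP and all
    # Chassis rows, so the stale duplicate can be spotted by hand.
    import subprocess

    def show_conflicting_encaps(ip='172.19.0.101'):
        for cmd in (['ovn-sbctl', 'find', 'Encap', f'ip={ip}'],
                    ['ovn-sbctl', 'list', 'Chassis']):
            print(subprocess.run(cmd, capture_output=True, text=True).stdout)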
Jan 30 04:35:40 np0005601978 podman[215080]: 2026-01-30 09:35:40.421500293 +0000 UTC m=+0.079090508 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:35:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:41.405 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:35:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:41.405 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:35:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:41.410 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:35:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:41.411 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
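Editor's note: the connecting / timed-out / waiting lines above show the reconnect loop backing off 2 s and then 4 s before the session to ssl:ovsdbserver-sb.openstack.svc:6642 comes back. A toy illustration of that doubling backoff in generic Python; this is not the actual OVS reconnect state machine:

    # Doubling backoff as seen above: wait 2 s, then 4 s, then succeed.
    import time

    def connect_with_backoff(try_connect, first_wait=2.0, max_wait=8.0):
        wait = first_wait
        while not try_connect():      # returns True once the SB server answers
            time.sleep(wait)          # "waiting N seconds before reconnect"
            wait = min(wait * 2, max_wait)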
Jan 30 04:35:43 np0005601978 podman[215106]: 2026-01-30 09:35:43.388984863 +0000 UTC m=+0.054453170 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:35:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:47.941 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:35:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:47.941 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:35:52 np0005601978 ovn_controller[95419]: 2026-01-30T09:35:52Z|00174|chassis|WARN|Dropped 12 log messages in last 13 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 30 04:35:52 np0005601978 ovn_controller[95419]: 2026-01-30T09:35:52Z|00175|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:35:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:52.944 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:35:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:52.945 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
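Editor's note: once reconnected, the agent records the new nb_cfg (16) by setting neutron:ovn-metadata-sb-cfg in the Chassis_Private external_ids; the transaction above is reported as "caused no change" because the value was already current. A hedged equivalent built from ovsdbapp's db_set, mirroring the DbSetCommand fields in the log; `sb_api` is an assumed, already-connected southbound backend, and passing if_exists=True through db_set is assumed to work in the ovsdbapp version deployed here:

    # Sketch: write the agent's sb-cfg marker into Chassis_Private external_ids.
    CHASSIS = '9803b804-d88a-4443-b777-6ecddbb75ed8'

    def mark_sb_cfg(sb_api, nb_cfg):
        with sb_api.transaction(check_error=True) as txn:
            txn.add(sb_api.db_set(
                'Chassis_Private', CHASSIS,
                ('external_ids', {'neutron:ovn-metadata-sb-cfg': str(nb_cfg)}),
                if_exists=True))   # matches DbSetCommand(..., if_exists=True) above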
Jan 30 04:35:55 np0005601978 podman[215131]: 2026-01-30 09:35:55.381145568 +0000 UTC m=+0.043348144 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, version=9.7, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, managed_by=edpm_ansible)
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:35:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:35:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:57.344 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:35:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:57.345 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:35:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:57.345 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:35:57 np0005601978 podman[215153]: 2026-01-30 09:35:57.410960003 +0000 UTC m=+0.068972476 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:35:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:59.597 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:b4:6e 2001:db8:0:1:f816:3eff:febf:b46e 2001:db8::f816:3eff:febf:b46e'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:febf:b46e/64 2001:db8::f816:3eff:febf:b46e/64', 'neutron:device_id': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e38aee4e-ba47-49c3-9bdf-bed97e27acef, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79c5a8be-b732-4d5f-86e3-0f3d570c8b43) old=Port_Binding(mac=['fa:16:3e:bf:b4:6e 2001:db8::f816:3eff:febf:b46e'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:febf:b46e/64', 'neutron:device_id': 'ovnmeta-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2b07532-97d0-4974-827c-4709f0bf52f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69532d75aefe4fa6ada76bf1c1d1da9b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:35:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:59.599 104657 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79c5a8be-b732-4d5f-86e3-0f3d570c8b43 in datapath f2b07532-97d0-4974-827c-4709f0bf52f6 updated#033[00m
Jan 30 04:35:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:59.603 104657 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2b07532-97d0-4974-827c-4709f0bf52f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 30 04:35:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:35:59.604 104657 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpwoozzy_d/privsep.sock']#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.247 104657 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.248 104657 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpwoozzy_d/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.123 215179 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.128 215179 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.132 215179 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.132 215179 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215179#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.251 215179 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd6dc98-dfcc-4ec6-8d7f-5eb52add83ba]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.684 215179 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.685 215179 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.685 215179 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:00.783 215179 DEBUG oslo.privsep.daemon [-] privsep: reply[84e3f59f-ee87-46dd-9f16-46458f5f671c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.812 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.814 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.814 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.815 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.815 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.816 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.816 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.816 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.817 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.845 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.846 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.847 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:03 np0005601978 nova_compute[182955]: 2026-01-30 09:36:03.847 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:36:04 np0005601978 nova_compute[182955]: 2026-01-30 09:36:04.050 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:36:04 np0005601978 nova_compute[182955]: 2026-01-30 09:36:04.051 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5945MB free_disk=73.36009216308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:36:04 np0005601978 nova_compute[182955]: 2026-01-30 09:36:04.051 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:04 np0005601978 nova_compute[182955]: 2026-01-30 09:36:04.052 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:04 np0005601978 nova_compute[182955]: 2026-01-30 09:36:04.162 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:36:04 np0005601978 nova_compute[182955]: 2026-01-30 09:36:04.162 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:36:04 np0005601978 podman[215184]: 2026-01-30 09:36:04.448548154 +0000 UTC m=+0.108263349 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:36:04 np0005601978 nova_compute[182955]: 2026-01-30 09:36:04.472 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:36:05 np0005601978 ovn_controller[95419]: 2026-01-30T09:36:05Z|00176|chassis|WARN|Dropped 27 log messages in last 12 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 30 04:36:05 np0005601978 ovn_controller[95419]: 2026-01-30T09:36:05Z|00177|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:36:05 np0005601978 nova_compute[182955]: 2026-01-30 09:36:05.088 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:36:05 np0005601978 nova_compute[182955]: 2026-01-30 09:36:05.089 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:36:05 np0005601978 nova_compute[182955]: 2026-01-30 09:36:05.089 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
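Editor's note: the inventory reported to placement above translates into schedulable capacity via the usual formula (total - reserved) * allocation_ratio, so this node offers 32 VCPU, 7167 MB of RAM and roughly 70 GB of disk. A quick arithmetic check over the exact values logged at 09:36:05.088:

    # Capacity check for the inventory logged above.
    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)   # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 70.2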
Jan 30 04:36:06 np0005601978 nova_compute[182955]: 2026-01-30 09:36:06.885 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 45.50 sec#033[00m
Jan 30 04:36:07 np0005601978 podman[215208]: 2026-01-30 09:36:07.415520827 +0000 UTC m=+0.071588629 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:36:09 np0005601978 nova_compute[182955]: 2026-01-30 09:36:09.084 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:09 np0005601978 nova_compute[182955]: 2026-01-30 09:36:09.084 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.456 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.457 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.457 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.490 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.490 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.491 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.491 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.672 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.672 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5948MB free_disk=73.36012649536133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.673 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.673 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.737 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.738 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.758 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.774 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.775 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:36:10 np0005601978 nova_compute[182955]: 2026-01-30 09:36:10.775 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
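The inventory reported to Placement at 09:36:10.774 implies the following schedulable capacity; a minimal sketch (not taken from the Nova source), assuming Placement's usual formula capacity = (total - reserved) * allocation_ratio:

    # Illustrative only: recompute schedulable capacity from the logged inventory.
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
    }
    for resource, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{resource}: {capacity:g} schedulable")
    # Expected: VCPU: 32, MEMORY_MB: 7167, DISK_GB: 70.2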
Jan 30 04:36:11 np0005601978 podman[215228]: 2026-01-30 09:36:11.449156144 +0000 UTC m=+0.100630105 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Jan 30 04:36:13 np0005601978 nova_compute[182955]: 2026-01-30 09:36:13.753 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:14 np0005601978 podman[215255]: 2026-01-30 09:36:14.427416158 +0000 UTC m=+0.082786347 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:36:14 np0005601978 nova_compute[182955]: 2026-01-30 09:36:14.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:15 np0005601978 nova_compute[182955]: 2026-01-30 09:36:15.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:15 np0005601978 nova_compute[182955]: 2026-01-30 09:36:15.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:36:15 np0005601978 nova_compute[182955]: 2026-01-30 09:36:15.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
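The _reclaim_queued_deletes skip above is expected: reclaim_instance_interval defaults to 0, so Nova deletes instances immediately rather than soft-deleting them. Enabling deferred deletion is a matter of setting reclaim_instance_interval to a positive number of seconds in the [DEFAULT] section of nova.conf (for example, reclaim_instance_interval = 3600 to keep soft-deleted instances recoverable for an hour); that value is illustrative, not taken from this deployment.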
Jan 30 04:36:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:15.730 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:36:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:15.730 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:36:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:15.732 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:36:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:15.733 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:36:16 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:16.737 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:36:16 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:16.738 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:36:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:17.738 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:36:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:17.739 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:36:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:17.739 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:36:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:17.740 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:36:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:19.745 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:36:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:19.745 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:36:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:21.746 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:36:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:21.746 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:36:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:21.746 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:36:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:21.747 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:36:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:25.753 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:36:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:25.755 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:36:26 np0005601978 podman[215279]: 2026-01-30 09:36:26.437663465 +0000 UTC m=+0.094242442 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Jan 30 04:36:28 np0005601978 podman[215300]: 2026-01-30 09:36:28.440398312 +0000 UTC m=+0.087114913 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:36:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:29.755 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:36:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:29.756 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:36:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:29.760 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:36:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:29.761 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
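The two IDL workers (PIDs 104657 and 105213) lose the southbound connection at 09:36:15 and then retry with a doubling timeout and backoff (2s, then 4s, then 8s) before suppressing further logging. A minimal sketch of that retry cadence, assuming a plain TCP probe for illustration (the real connection is TLS and is handled inside ovsdbapp/OVS, not by this code):

    import socket
    import time

    def connect_with_backoff(host, port, max_backoff=8.0):
        """Retry a connection, doubling the timeout/backoff up to max_backoff."""
        backoff = 1.0
        while True:
            try:
                with socket.create_connection((host, port), timeout=backoff):
                    print("connected")
                    return
            except OSError:
                backoff = min(backoff * 2, max_backoff)
                print(f"connection attempt timed out, waiting {backoff:g}s before reconnect")
                time.sleep(backoff)

    # e.g. connect_with_backoff("ovsdbserver-sb.openstack.svc", 6642)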
Jan 30 04:36:32 np0005601978 ovn_controller[95419]: 2026-01-30T09:36:32Z|00178|chassis|WARN|Dropped 7 log messages in last 28 seconds (most recently, 22 seconds ago) due to excessive rate
Jan 30 04:36:32 np0005601978 ovn_controller[95419]: 2026-01-30T09:36:32Z|00179|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:36:35 np0005601978 podman[215321]: 2026-01-30 09:36:35.410031148 +0000 UTC m=+0.074662312 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:36:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:37.770 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:36:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:37.780 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:36:38 np0005601978 podman[215346]: 2026-01-30 09:36:38.420372201 +0000 UTC m=+0.077314726 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:36:42 np0005601978 podman[215364]: 2026-01-30 09:36:42.439551182 +0000 UTC m=+0.098219757 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 30 04:36:45 np0005601978 podman[215390]: 2026-01-30 09:36:45.423126124 +0000 UTC m=+0.081281531 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:36:52 np0005601978 ovn_controller[95419]: 2026-01-30T09:36:52Z|00180|chassis|WARN|Dropped 24 log messages in last 20 seconds (most recently, 10 seconds ago) due to excessive rate
Jan 30 04:36:52 np0005601978 ovn_controller[95419]: 2026-01-30T09:36:52Z|00181|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
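The chassis warnings above indicate that chassis record d14b9ab5-bf6e-4142-ad45-b863645e483d already holds a geneve encap for 172.19.0.101, so ovn-controller refuses to register the same encap IP and type again under 9803b804-d88a-4443-b777-6ecddbb75ed8 and rate-limits the repeated messages. The competing records can be inspected directly in the southbound database, for example with ovn-sbctl list Chassis and ovn-sbctl list Encap (run wherever the SB DB is reachable, using the same TLS certificates the ovn_controller container mounts).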
Jan 30 04:36:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:57.345 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:36:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:57.346 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:36:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:36:57.346 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:36:57 np0005601978 podman[215415]: 2026-01-30 09:36:57.416645331 +0000 UTC m=+0.071851935 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 30 04:36:59 np0005601978 podman[215438]: 2026-01-30 09:36:59.416355324 +0000 UTC m=+0.076161268 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:37:01 np0005601978 nova_compute[182955]: 2026-01-30 09:37:01.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:04 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:04.754 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:37:04 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:04.755 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:37:04 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:04.756 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:37:04 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:04.756 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:37:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:05.764 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:05.764 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:06 np0005601978 podman[215458]: 2026-01-30 09:37:06.375367998 +0000 UTC m=+0.041187979 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:37:06 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:06.765 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:37:06 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:06.766 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:37:06 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:06.767 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:37:06 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:06.767 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:37:08 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:08.773 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:08 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:08.773 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:09 np0005601978 podman[215483]: 2026-01-30 09:37:09.416793678 +0000 UTC m=+0.076986788 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:37:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:10.775 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:37:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:10.775 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:37:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:10.776 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:37:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:10.776 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:37:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:37:13Z|00182|chassis|WARN|Dropped 7 log messages in last 20 seconds (most recently, 12 seconds ago) due to excessive rate
Jan 30 04:37:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:37:13Z|00183|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:37:13 np0005601978 podman[215502]: 2026-01-30 09:37:13.431356569 +0000 UTC m=+0.092342237 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:37:14 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:14.782 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:14 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:14.785 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:16 np0005601978 podman[215528]: 2026-01-30 09:37:16.404308995 +0000 UTC m=+0.057552221 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:37:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:18.784 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:37:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:18.784 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:37:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:18.791 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:37:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:18.791 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:37:28 np0005601978 podman[215552]: 2026-01-30 09:37:28.397817993 +0000 UTC m=+0.058810652 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter)
Jan 30 04:37:30 np0005601978 podman[215574]: 2026-01-30 09:37:30.41219996 +0000 UTC m=+0.072250654 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:37:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:33.882 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:37:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:33.890 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:37:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:33.891 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:37:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:33.930 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
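
The ovn_metadata_agent lines above show its SbGlobalUpdateEvent matching an update of the single SB_Global row (nb_cfg 16 -> 17) and then deferring the chassis-table update for a few seconds. A minimal sketch of how such a watcher is typically registered with ovsdbapp (the class below is illustrative, and sb_idl stands for an already-connected southbound IDL):

    # Sketch of an ovsdbapp row-event watcher like the SbGlobalUpdateEvent above.
    from ovsdbapp.backend.ovs_idl import event as row_event

    class SbGlobalUpdated(row_event.RowEvent):
        def __init__(self):
            # Fire only on updates to the (single) SB_Global row.
            super().__init__((self.ROW_UPDATE,), 'SB_Global', None)

        def run(self, event, row, old):
            # row.nb_cfg is the northbound sequence number seen in the log line above.
            print('SB_Global nb_cfg is now', row.nb_cfg)

    # Registration is roughly: sb_idl.notify_handler.watch_event(SbGlobalUpdated())
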
Jan 30 04:37:37 np0005601978 podman[215597]: 2026-01-30 09:37:37.388421843 +0000 UTC m=+0.047831418 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:37:40 np0005601978 podman[215623]: 2026-01-30 09:37:40.409061904 +0000 UTC m=+0.068147466 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:37:41 np0005601978 nova_compute[182955]: 2026-01-30 09:37:41.604 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 44.71 sec#033[00m
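
The oslo.service warning above means nova.servicegroup.drivers.db.DbDriver._report_state took about 44.7 s longer than its fixed reporting interval (commonly 10 s), so the compute service heartbeat ran late; that usually points at a slow database or RPC round trip rather than at the loop itself. A minimal sketch of the primitive that emits this warning (the callback is a stand-in):

    # Sketch: oslo.service fixed-interval loop; when the callback runs longer than
    # `interval`, it logs the "run outlasted interval by N sec" warning seen above.
    from oslo_service import loopingcall

    def report_state():
        pass  # placeholder for the periodic work (e.g. the service heartbeat)

    timer = loopingcall.FixedIntervalLoopingCall(report_state)
    timer.start(interval=10.0)  # seconds between runs
    # timer.wait()  # would block until the loop is stopped
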
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.120 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.121 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.121 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.150 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.151 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.152 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.153 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.154 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.155 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.155 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.156 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.156 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.190 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.190 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.191 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
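
The three lockutils lines above are the usual acquire/acquired/released trace for the "compute_resources" semaphore that serializes resource-tracker operations. A minimal sketch of the same oslo.concurrency pattern (the lock name is reused purely for illustration):

    # Sketch: oslo.concurrency named lock, which produces the
    # "Acquiring lock ..." / "Lock ... acquired" / "released" DEBUG lines above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def clean_compute_node_cache():
        pass  # work performed while holding the lock

    # or, equivalently, as a context manager:
    with lockutils.lock('compute_resources'):
        pass
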
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.191 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.348 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.349 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5957MB free_disk=73.36014556884766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.349 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.349 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.798 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.799 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.844 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.859 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
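
The inventory dict above is what nova-compute reports to Placement for provider 5912bad0-7860-4f37-8078-1db5720295f4; the capacity Placement allocates against is (total - reserved) * allocation_ratio per resource class. A quick worked check against the logged values:

    # Worked example: capacity derived from the inventory reported above.
    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 70.2; individual allocations are further
    # constrained by min_unit/max_unit/step_size, omitted here for brevity.
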
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.863 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.863 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.514s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.864 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.864 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:37:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:42.892 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:37:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:42.893 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.895 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.896 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.896 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:37:42 np0005601978 nova_compute[182955]: 2026-01-30 09:37:42.918 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
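
The stream of "Running periodic task ComputeManager._..." lines above is oslo.service's periodic task machinery iterating over every task registered on the compute manager, each with its own spacing. A minimal sketch of how such tasks are declared (class and task names are placeholders, not nova's):

    # Sketch: oslo.service periodic tasks, the source of the
    # "Running periodic task ..." DEBUG lines above.
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)
        def _poll_something(self, context):
            pass  # runs roughly every 60 seconds

    # The hosting service periodically calls run_periodic_tasks(context),
    # which logs and invokes each registered task in turn.
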
Jan 30 04:37:44 np0005601978 ovn_controller[95419]: 2026-01-30T09:37:44Z|00184|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:37:44 np0005601978 ovn_controller[95419]: 2026-01-30T09:37:44Z|00185|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
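
The ovn-controller warning above says another Chassis record ('d14b9ab5-...') already owns the geneve encap IP 172.19.0.101 that this node's chassis ('9803b804-...') is trying to register, which usually indicates a stale or duplicate chassis entry in the southbound DB. A hedged sketch for inspecting the relevant tables from the node (assumes ovn-sbctl can reach the SB DB, e.g. from inside the ovn_controller container, possibly with explicit --db and TLS options):

    # Sketch: dump Chassis and Encap rows to locate the conflicting entry behind the
    # "already has encap ip ... cannot duplicate" warning.
    import subprocess

    for table in ("Chassis", "Encap"):
        out = subprocess.run(["ovn-sbctl", "list", table],
                             capture_output=True, text=True, check=True).stdout
        print(f"=== {table} ===\n{out}")
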
Jan 30 04:37:44 np0005601978 podman[215642]: 2026-01-30 09:37:44.463348167 +0000 UTC m=+0.122618683 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:37:47 np0005601978 podman[215669]: 2026-01-30 09:37:47.381326204 +0000 UTC m=+0.046126858 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:37:55 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:55.314 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:37:55 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:55.315 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:37:55 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:55.316 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:37:55 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:55.317 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.754 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:37:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:37:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
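
The run of "Skip pollster ..., no resources found this cycle" lines is expected on a compute node with no running instances (nova logged "Didn't find any instances for network info cache update" at 09:37:42.150): the compute polling agent's per-cycle instance discovery comes back empty, so every per-instance meter is skipped. A minimal sketch of the same kind of discovery check against libvirt (the connection URI is the usual system one, shown for illustration):

    # Sketch: the per-instance meters above are skipped whenever no libvirt domains exist.
    import libvirt  # python3-libvirt bindings

    conn = libvirt.open('qemu:///system')
    domains = conn.listAllDomains()
    print(f"{len(domains)} instance(s) found")  # 0 here, hence the skipped pollsters
    conn.close()
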
Jan 30 04:37:55 np0005601978 nova_compute[182955]: 2026-01-30 09:37:55.938 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:37:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:56.332 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:56.342 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:57.340 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:37:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:57.341 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:37:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:57.344 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:37:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:57.344 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:37:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:57.346 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:37:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:57.346 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:37:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:57.346 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:37:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:59.346 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:37:59.349 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:37:59 np0005601978 podman[215694]: 2026-01-30 09:37:59.410411465 +0000 UTC m=+0.073201818 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, vcs-type=git, version=9.7)
Jan 30 04:38:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:38:01Z|00186|chassis|WARN|Dropped 3 log messages in last 17 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 30 04:38:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:38:01Z|00187|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:38:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:01.349 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:38:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:01.349 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:38:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:01.352 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:38:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:01.353 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:38:01 np0005601978 podman[215715]: 2026-01-30 09:38:01.439185646 +0000 UTC m=+0.092222013 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:38:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:05.358 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:38:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:05.358 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:38:07 np0005601978 nova_compute[182955]: 2026-01-30 09:38:07.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:38:08 np0005601978 podman[215736]: 2026-01-30 09:38:08.389647113 +0000 UTC m=+0.050905124 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:38:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:09.361 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:38:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:09.362 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:38:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:09.367 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:38:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:09.368 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
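
The reconnect sequence above follows the OVSDB client's backoff: the wait between failed SSL attempts roughly doubles (2 s, then 4 s, ...) until logging is suppressed and retries continue quietly in the background. A minimal sketch of that retry shape (try_connect and the cap are placeholders, not the library's actual values):

    # Sketch of the doubling backoff between reconnect attempts seen in the log.
    import time

    def reconnect(try_connect, max_backoff=8.0):
        backoff = 2.0
        while not try_connect():
            print(f"waiting {backoff:.0f} seconds before reconnect")
            time.sleep(backoff)
            backoff = min(backoff * 2, max_backoff)
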
Jan 30 04:38:10 np0005601978 nova_compute[182955]: 2026-01-30 09:38:10.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:38:11 np0005601978 podman[215761]: 2026-01-30 09:38:11.421591404 +0000 UTC m=+0.081083576 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 30 04:38:11 np0005601978 nova_compute[182955]: 2026-01-30 09:38:11.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:38:11 np0005601978 nova_compute[182955]: 2026-01-30 09:38:11.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:38:11 np0005601978 nova_compute[182955]: 2026-01-30 09:38:11.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:38:15 np0005601978 ovn_controller[95419]: 2026-01-30T09:38:15Z|00188|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:38:15 np0005601978 podman[215780]: 2026-01-30 09:38:15.512632969 +0000 UTC m=+0.075535824 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.051 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.052 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:38:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:17.053 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
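
[annotation] The traceback above ends in RowNotFound: inside the ovsdbapp transaction the agent looked up its own Chassis_Private record by name and the row was not present in the southbound database at that moment. A minimal sketch (not the agent's own code) that repeats the same lookup, assuming it is run inside the ovn_metadata_agent container where ovsdbapp, python-ovs and the mounted ovndb certificates are available; the SB endpoint and certificate paths are the ones that appear elsewhere in this log:

    # Minimal sketch: repeat the Chassis_Private lookup that raised RowNotFound above.
    # Assumes the ovn_metadata_agent container environment; endpoint and cert paths
    # are taken from this log, everything else is illustrative.
    from ovs import stream
    from ovsdbapp.backend.ovs_idl import connection, idlutils
    from ovsdbapp.schema.ovn_southbound import impl_idl

    SB_ENDPOINT = 'ssl:ovsdbserver-sb.openstack.svc:6642'
    CHASSIS_NAME = '9803b804-d88a-4443-b777-6ecddbb75ed8'

    # TLS material as mounted into the container.
    stream.Stream.ssl_set_private_key_file('/etc/pki/tls/private/ovndb.key')
    stream.Stream.ssl_set_certificate_file('/etc/pki/tls/certs/ovndb.crt')
    stream.Stream.ssl_set_ca_cert_file('/etc/pki/tls/certs/ovndbca.crt')

    # Build an IDL connection to the southbound DB and wrap it in the SB API.
    idl = connection.OvsdbIdl.from_server(SB_ENDPOINT, 'OVN_Southbound')
    api = impl_idl.OvnSbApiIdlImpl(connection.Connection(idl, timeout=10))

    try:
        row = api.lookup('Chassis_Private', CHASSIS_NAME)
        print('Chassis_Private present:', row.uuid)
    except idlutils.RowNotFound:
        print('Chassis_Private row still missing for', CHASSIS_NAME)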
Jan 30 04:38:18 np0005601978 podman[215807]: 2026-01-30 09:38:18.383784463 +0000 UTC m=+0.042208525 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:38:30 np0005601978 podman[215832]: 2026-01-30 09:38:30.419749768 +0000 UTC m=+0.074254952 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, architecture=x86_64, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, managed_by=edpm_ansible)
Jan 30 04:38:32 np0005601978 podman[215854]: 2026-01-30 09:38:32.441685565 +0000 UTC m=+0.100952543 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:38:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:33.383 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:38:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:33.412 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:38:39 np0005601978 podman[215874]: 2026-01-30 09:38:39.398793173 +0000 UTC m=+0.055227897 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:38:41 np0005601978 ovn_controller[95419]: 2026-01-30T09:38:41Z|00189|chassis|WARN|Dropped 1 log messages in last 27 seconds (most recently, 27 seconds ago) due to excessive rate
Jan 30 04:38:41 np0005601978 ovn_controller[95419]: 2026-01-30T09:38:41Z|00190|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
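
[annotation] The warning above means a different Chassis record (d14b9ab5-...) still holds encap IP 172.19.0.101 with type geneve, so this host's chassis id (9803b804-...) cannot register the same tunnel endpoint; this often points at a stale or duplicate chassis row in the southbound DB. A sketch that walks every Chassis row and prints its encaps to identify the current holder; it assumes the `api` handle built in the sketch after the traceback above:

    # Sketch: list each Chassis row with its (type, ip) encap pairs to see which
    # record currently owns 172.19.0.101. Assumes `api` is the OvnSbApiIdlImpl
    # handle constructed in the earlier sketch.
    for chassis in api.db_list_rows('Chassis').execute(check_error=True):
        encaps = [(encap.type, encap.ip) for encap in chassis.encaps]
        print(chassis.name, chassis.hostname, encaps)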
Jan 30 04:38:42 np0005601978 podman[215898]: 2026-01-30 09:38:42.440772595 +0000 UTC m=+0.094421346 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:38:46 np0005601978 podman[215918]: 2026-01-30 09:38:46.457086717 +0000 UTC m=+0.110828960 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:38:49 np0005601978 podman[215945]: 2026-01-30 09:38:49.396595731 +0000 UTC m=+0.055430289 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:38:54 np0005601978 ovn_controller[95419]: 2026-01-30T09:38:54Z|00191|chassis|WARN|Dropped 60 log messages in last 12 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 30 04:38:54 np0005601978 ovn_controller[95419]: 2026-01-30T09:38:54Z|00192|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:38:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:57.349 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:38:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:57.349 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:38:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:38:57.350 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:38:58 np0005601978 nova_compute[182955]: 2026-01-30 09:38:58.404 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 46.80 sec#033[00m
Jan 30 04:39:01 np0005601978 podman[215969]: 2026-01-30 09:39:01.38933623 +0000 UTC m=+0.053822433 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.7, vcs-type=git, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:39:02 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:02.258 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:39:02 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:02.259 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:39:03 np0005601978 podman[215990]: 2026-01-30 09:39:03.436257556 +0000 UTC m=+0.099165031 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.109 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.109 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.110 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.110 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.111 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.134 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.135 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.135 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.160 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.161 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.161 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.162 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.322 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.323 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5950MB free_disk=73.36014556884766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.324 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.324 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.387 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.387 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.409 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.421 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
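
[annotation] The inventory dictionary above is what the resource tracker reports to Placement; the capacity the scheduler may consume from each resource class is (total - reserved) * allocation_ratio. A small worked example with the values from that line:

    # Worked example: effective capacity Placement derives from the inventory above,
    # using capacity = (total - reserved) * allocation_ratio per resource class.
    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: {capacity}")  # VCPU: 32.0, MEMORY_MB: 7167.0, DISK_GB: 70.2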
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.424 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.424 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.747 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.748 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:07 np0005601978 nova_compute[182955]: 2026-01-30 09:39:07.793 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:10.261 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:39:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:10.262 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
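
[annotation] The DbSetCommand above is the agent acknowledging nb_cfg=19 (from the SB_Global update at 09:39:02) by writing neutron:ovn-metadata-sb-cfg into its Chassis_Private external_ids. With if_exists=True, "Transaction caused no change" means either the stored value already matched or the target row was absent and the write was skipped. The same idempotent write, sketched with the handle from the earlier sketch:

    # Sketch of the write the DbSetCommand above performs; with if_exists=True a
    # missing Chassis_Private row is skipped instead of raising RowNotFound.
    # Assumes `api` and CHASSIS_NAME from the sketch following the traceback.
    api.db_set('Chassis_Private', CHASSIS_NAME,
               ('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),
               if_exists=True).execute(check_error=True)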
Jan 30 04:39:10 np0005601978 podman[216010]: 2026-01-30 09:39:10.384155321 +0000 UTC m=+0.050027981 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:39:11 np0005601978 nova_compute[182955]: 2026-01-30 09:39:11.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:11 np0005601978 nova_compute[182955]: 2026-01-30 09:39:11.433 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:39:11 np0005601978 nova_compute[182955]: 2026-01-30 09:39:11.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:39:11 np0005601978 nova_compute[182955]: 2026-01-30 09:39:11.480 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:39:11 np0005601978 nova_compute[182955]: 2026-01-30 09:39:11.480 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:12 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:12.499 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:39:12 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:12.499 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:39:12 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:12.501 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:39:12 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:12.501 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:39:13 np0005601978 nova_compute[182955]: 2026-01-30 09:39:13.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:13 np0005601978 podman[216035]: 2026-01-30 09:39:13.438724856 +0000 UTC m=+0.092474169 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 30 04:39:13 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:13.504 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:39:13 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:13.505 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:14 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:14.506 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:39:14 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:14.506 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:39:14 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:14.506 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:39:14 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:14.507 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.548 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.548 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.549 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.549 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.718 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.719 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5960MB free_disk=73.36012649536133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.719 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.719 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.816 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.816 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.836 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.848 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.849 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:39:14 np0005601978 nova_compute[182955]: 2026-01-30 09:39:14.850 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:39:16 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:16.513 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:39:16 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:16.513 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:39:16 np0005601978 nova_compute[182955]: 2026-01-30 09:39:16.845 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:17 np0005601978 ovn_controller[95419]: 2026-01-30T09:39:17Z|00193|chassis|WARN|Dropped 18 log messages in last 23 seconds (most recently, 15 seconds ago) due to excessive rate
Jan 30 04:39:17 np0005601978 ovn_controller[95419]: 2026-01-30T09:39:17Z|00194|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:39:17 np0005601978 podman[216054]: 2026-01-30 09:39:17.423222644 +0000 UTC m=+0.084705123 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 30 04:39:18 np0005601978 nova_compute[182955]: 2026-01-30 09:39:18.432 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:18 np0005601978 nova_compute[182955]: 2026-01-30 09:39:18.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:18.514 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:39:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:18.514 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:39:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:18.516 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:39:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:18.516 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:39:19 np0005601978 nova_compute[182955]: 2026-01-30 09:39:19.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:39:19 np0005601978 nova_compute[182955]: 2026-01-30 09:39:19.433 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:39:20 np0005601978 podman[216081]: 2026-01-30 09:39:20.44243007 +0000 UTC m=+0.094468878 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
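Each podman health_status event above packs the whole container configuration into one parenthesised key=value list; for monitoring it is usually enough to pull out the container name, its health state and the failing streak. A small parsing sketch based only on the field layout visible in these lines (the regular expression is an assumption about that layout, not a podman API):

    import re

    # Matches "... container health_status <64-hex-id> (image=..., name=<name>, ...,
    # health_status=<state>, health_failing_streak=<n>, ...)" as emitted above.
    LINE_RE = re.compile(
        r"container health_status (?P<cid>[0-9a-f]{64}) \(.*?"
        r"name=(?P<name>[^,]+),.*?"
        r"health_status=(?P<status>[^,]+),.*?"
        r"health_failing_streak=(?P<streak>\d+)"
    )

    def parse_health(line):
        m = LINE_RE.search(line)
        if not m:
            return None
        return {
            "container_id": m.group("cid"),
            "name": m.group("name"),
            "health_status": m.group("status"),
            "failing_streak": int(m.group("streak")),
        }

    # Applied to the node_exporter line above, this yields
    # {'name': 'node_exporter', 'health_status': 'healthy', 'failing_streak': 0, ...}.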
Jan 30 04:39:22 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:22.521 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:39:22 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:22.523 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:39:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:26.524 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:39:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:26.525 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:39:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:26.528 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:39:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:26.529 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
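The ovsdbapp IDL above retries the Southbound connection with a growing delay (2 s, then 4 s elsewhere in this section) and, after repeated timeouts, keeps reconnecting in the background while muting further messages. A self-contained sketch of that connect-with-backoff pattern; it is illustrative only, not the ovsdbapp implementation, and the host/port are just the values visible in these lines:

    import socket
    import time

    def connect_with_backoff(host, port, timeout=4, max_backoff=8):
        """Keep retrying a TCP connection, doubling the wait between attempts.

        Mirrors the behaviour in the log: an attempt can time out, the wait
        grows (2 s, 4 s, ...), and after a few failures a real client would
        stop logging while still retrying in the background.
        """
        backoff = 2
        attempt = 0
        while True:
            attempt += 1
            try:
                return socket.create_connection((host, port), timeout=timeout)
            except OSError as exc:
                print(f"attempt {attempt}: {exc}; waiting {backoff} seconds before reconnect")
                time.sleep(backoff)
                backoff = min(backoff * 2, max_backoff)

    # Example (endpoint taken from the log, TLS handshake omitted):
    # connect_with_backoff("ovsdbserver-sb.openstack.svc", 6642)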
Jan 30 04:39:32 np0005601978 podman[216106]: 2026-01-30 09:39:32.415321312 +0000 UTC m=+0.074671122 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:39:34 np0005601978 podman[216128]: 2026-01-30 09:39:34.410092387 +0000 UTC m=+0.059165031 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 30 04:39:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:34.541 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:39:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:34.542 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:39:41 np0005601978 podman[216150]: 2026-01-30 09:39:41.418291559 +0000 UTC m=+0.079225022 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:39:44 np0005601978 podman[216174]: 2026-01-30 09:39:44.412514226 +0000 UTC m=+0.066755301 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 30 04:39:48 np0005601978 ovn_controller[95419]: 2026-01-30T09:39:48Z|00195|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:39:48 np0005601978 ovn_controller[95419]: 2026-01-30T09:39:48Z|00196|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:39:48 np0005601978 podman[216194]: 2026-01-30 09:39:48.396059002 +0000 UTC m=+0.060543164 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:39:51 np0005601978 podman[216221]: 2026-01-30 09:39:51.382223335 +0000 UTC m=+0.045402350 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:39:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:39:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
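The burst of "Skip pollster ..., no resources found this cycle" messages at 09:39:55 is the ceilometer compute agent walking its meter list while libvirt discovery returns no instances on this host, so every instance-scoped meter is skipped. A simplified sketch of that poll-and-skip control flow (hypothetical helper names, not the ceilometer code itself):

    def poll_and_notify(pollsters, discover_resources, get_samples, publish):
        """One polling cycle: skip any pollster whose discovery finds nothing."""
        for name in pollsters:
            resources = discover_resources(name)
            if not resources:
                print(f"Skip pollster {name}, no resources found this cycle")
                continue
            for sample in get_samples(name, resources):
                publish(sample)

    # With no instances on the hypervisor, discovery yields an empty list and
    # every meter seen above (cpu, memory.usage, disk.device.*, network.*) is skipped:
    meters = ["cpu", "memory.usage", "disk.device.read.bytes", "network.incoming.bytes"]
    poll_and_notify(meters, lambda name: [], lambda name, res: [], print)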
Jan 30 04:39:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:57.350 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:39:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:57.350 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:39:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:39:57.351 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
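The three lockutils lines show the metadata agent's ProcessMonitor wrapping its child-process check in an internal lock: acquire, run, release, with the waited/held times logged. That pattern comes from oslo.concurrency's synchronized decorator; a minimal sketch, assuming oslo.concurrency is installed (it ships in these containers) and that debug logging is enabled so the acquire/release messages actually appear:

    from oslo_concurrency import lockutils

    # With debug logging configured, calling this emits the same
    # "Acquiring lock ... / Lock ... acquired / Lock ... released" trio seen above.
    @lockutils.synchronized("_check_child_processes")
    def check_child_processes():
        # stand-in for the real ProcessMonitor work
        return "children checked"

    print(check_child_processes())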
Jan 30 04:40:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:40:01Z|00197|chassis|WARN|Dropped 13 log messages in last 13 seconds (most recently, 4 seconds ago) due to excessive rate
Jan 30 04:40:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:40:01Z|00198|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:40:03 np0005601978 podman[216246]: 2026-01-30 09:40:03.402540717 +0000 UTC m=+0.067357148 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:40:05 np0005601978 podman[216268]: 2026-01-30 09:40:05.398163452 +0000 UTC m=+0.059403757 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:40:09 np0005601978 nova_compute[182955]: 2026-01-30 09:40:09.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:12 np0005601978 podman[216288]: 2026-01-30 09:40:12.415593616 +0000 UTC m=+0.070944033 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:40:12 np0005601978 nova_compute[182955]: 2026-01-30 09:40:12.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:12 np0005601978 nova_compute[182955]: 2026-01-30 09:40:12.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:40:12 np0005601978 nova_compute[182955]: 2026-01-30 09:40:12.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:40:12 np0005601978 nova_compute[182955]: 2026-01-30 09:40:12.437 182959 ERROR oslo.messaging._drivers.impl_rabbit [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] [34731e33-da8c-4268-9d71-432061e8bee1] AMQP server on rabbitmq-cell1.openstack.svc:5671 is unreachable: [Errno 110] Connection timed out. Trying again in 1 seconds.: TimeoutError: [Errno 110] Connection timed out#033[00m
Jan 30 04:40:13 np0005601978 nova_compute[182955]: 2026-01-30 09:40:13.475 182959 INFO oslo.messaging._drivers.impl_rabbit [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] [34731e33-da8c-4268-9d71-432061e8bee1] Reconnected to AMQP server on rabbitmq-cell1.openstack.svc:5671 via [amqp] client with port 43132.#033[00m
Jan 30 04:40:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:15.283 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:40:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:15.283 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:40:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:15.284 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:40:15 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:15.285 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:40:15 np0005601978 podman[216312]: 2026-01-30 09:40:15.420160307 +0000 UTC m=+0.066964392 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:40:16 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:16.290 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:40:16 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:16.291 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:40:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:17.292 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:40:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:17.292 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:40:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:17.293 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:40:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:17.293 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:40:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:19.300 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:40:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:19.301 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:40:19 np0005601978 ovn_controller[95419]: 2026-01-30T09:40:19Z|00199|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:40:19 np0005601978 podman[216331]: 2026-01-30 09:40:19.423898362 +0000 UTC m=+0.083275049 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:40:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:21.301 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:40:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:21.302 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:40:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:21.304 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:40:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:21.304 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.350 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.350 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.350 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.350 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.350 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.351 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.351 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.351 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.375 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.375 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.375 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.376 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.539 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.540 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5960MB free_disk=73.36012649536133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.540 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.540 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.721 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.722 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.783 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing inventories for resource provider 5912bad0-7860-4f37-8078-1db5720295f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.847 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating ProviderTree inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.848 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.863 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing aggregate associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.888 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing trait associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.908 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.927 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.929 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:40:21 np0005601978 nova_compute[182955]: 2026-01-30 09:40:21.930 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
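The inventory pushed to placement above (VCPU total 8 with allocation_ratio 4.0, MEMORY_MB total 7679 with 512 reserved, DISK_GB total 79 with 1 reserved and allocation_ratio 0.9) fixes how far this node can be oversubscribed: placement treats the usable capacity of each resource class as (total - reserved) * allocation_ratio. A short worked check of what the numbers from the log imply:

    # capacity = (total - reserved) * allocation_ratio, applied to the
    # inventory reported for provider 5912bad0-7860-4f37-8078-1db5720295f4.
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)
    # -> VCPU 32.0, MEMORY_MB 7167.0, DISK_GB ~70.2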
Jan 30 04:40:22 np0005601978 podman[216357]: 2026-01-30 09:40:22.405890495 +0000 UTC m=+0.058433165 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
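[annotation] The podman line above is a periodic health_status event for the node_exporter container. A small sketch for checking the same state on demand; the container name comes from the log, and the JSON keys are probed defensively because podman's inspect layout differs slightly between versions:

    # Sketch: query the current health state of the node_exporter container
    # referenced by the health_status events in this log.
    import json
    import subprocess

    raw = subprocess.run(['podman', 'inspect', 'node_exporter'],
                         check=True, capture_output=True, text=True).stdout
    state = json.loads(raw)[0].get('State', {})
    health = state.get('Health') or state.get('Healthcheck') or {}
    print(state.get('Status'), health.get('Status'))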
Jan 30 04:40:23 np0005601978 nova_compute[182955]: 2026-01-30 09:40:23.397 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 44.99 sec#033[00m
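[annotation] The warning above is emitted by oslo.service's fixed-interval looping call: the _report_state callback took roughly 45 s longer than its configured interval. A hedged sketch of the mechanism; the callback body and the 10 s interval here are illustrative, not Nova's actual values:

    # Sketch of the oslo.service primitive that logs
    # "Function ... run outlasted interval by N sec" when a periodic callback
    # takes longer than its interval.
    import time
    from oslo_service import loopingcall

    runs = []

    def report_state():
        runs.append(time.time())
        time.sleep(15)                      # pretend the DB write stalled past the interval
        if len(runs) >= 2:
            raise loopingcall.LoopingCallDone()  # end the demo loop

    timer = loopingcall.FixedIntervalLoopingCall(report_state)
    timer.start(interval=10)    # the slow first run triggers the overrun warning
    timer.wait()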
Jan 30 04:40:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:25.305 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:40:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:25.310 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:40:25 np0005601978 nova_compute[182955]: 2026-01-30 09:40:25.925 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:40:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:29.306 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:40:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:29.307 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:40:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:29.314 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:40:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:29.315 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
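[annotation] Both metadata-agent workers time out connecting to the OVN southbound endpoint here and only reconnect at 09:40:44. A minimal sketch for probing whether that TLS endpoint is reachable from this node at all; host and port are taken from the log, certificate verification is skipped because this only checks connectivity, and a handshake failure is not conclusive since ovsdb-server may also require a client certificate:

    # Sketch: basic TLS reachability probe for the southbound endpoint named
    # in the reconnect messages above.
    import socket
    import ssl

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection(('ovsdbserver-sb.openstack.svc', 6642), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname='ovsdbserver-sb.openstack.svc') as tls:
            print('connected:', tls.version())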
Jan 30 04:40:34 np0005601978 podman[216381]: 2026-01-30 09:40:34.420173456 +0000 UTC m=+0.067184297 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9/ubi-minimal)
Jan 30 04:40:36 np0005601978 podman[216404]: 2026-01-30 09:40:36.385303204 +0000 UTC m=+0.049973837 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:40:43 np0005601978 podman[216424]: 2026-01-30 09:40:43.405983165 +0000 UTC m=+0.065217150 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:40:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:44.409 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:40:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:44.411 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:40:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:44.417 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:40:44 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:44.418 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
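[annotation] The agent matched an SB_Global update (nb_cfg 19 -> 20) through an ovsdbapp row event and then deferred its chassis write by a couple of seconds. A rough sketch of the event-matching shape involved, not the agent's actual class; the handler body is hypothetical and the event only fires once registered with the IDL's notify handler:

    # Sketch only: the general shape of an ovsdbapp row event like the
    # SbGlobalUpdateEvent matched above.
    from ovsdbapp import event

    class SbGlobalUpdateEvent(event.RowEvent):
        def __init__(self):
            super().__init__((self.ROW_UPDATE,), 'SB_Global', None)

        def run(self, ev, row, old):
            # Called when an SB_Global row changes, e.g. nb_cfg bumped by northd.
            print('SB_Global nb_cfg is now', row.nb_cfg)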
Jan 30 04:40:46 np0005601978 podman[216448]: 2026-01-30 09:40:46.401970608 +0000 UTC m=+0.061070118 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:40:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:46.420 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:40:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:46.420 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:40:50 np0005601978 ovn_controller[95419]: 2026-01-30T09:40:50Z|00200|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:40:50 np0005601978 ovn_controller[95419]: 2026-01-30T09:40:50Z|00201|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
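[annotation] ovn-controller keeps warning that chassis d14b9ab5-... already owns the geneve encap IP 172.19.0.101 that 9803b804-... also tries to register, which lines up with the Chassis_Private churn further down. A hedged diagnostic sketch for listing the encap rows; it assumes ovn-sbctl can reach the southbound database (for example from inside the ovn_controller container) with whatever certificates ovn-controller is configured to use:

    # Sketch: dump the Encap table from the SB database to spot the duplicate
    # encap IP named in the warning above.
    import json
    import subprocess

    out = subprocess.run(['ovn-sbctl', '--format=json', 'list', 'Encap'],
                         check=True, capture_output=True, text=True).stdout
    encaps = json.loads(out)
    print(encaps['headings'])
    for row in encaps['data']:
        print(row)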
Jan 30 04:40:50 np0005601978 podman[216467]: 2026-01-30 09:40:50.428846107 +0000 UTC m=+0.081453315 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 30 04:40:53 np0005601978 podman[216494]: 2026-01-30 09:40:53.380205413 +0000 UTC m=+0.044792032 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:40:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:57.353 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:40:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:57.354 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:40:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:40:57.354 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:41:05 np0005601978 podman[216517]: 2026-01-30 09:41:05.407256046 +0000 UTC m=+0.072601559 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855)
Jan 30 04:41:07 np0005601978 podman[216538]: 2026-01-30 09:41:07.407774536 +0000 UTC m=+0.059194282 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 30 04:41:07 np0005601978 nova_compute[182955]: 2026-01-30 09:41:07.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:10.067 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:41:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:10.067 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:41:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:10.069 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:41:10 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:10.069 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:41:14 np0005601978 podman[216558]: 2026-01-30 09:41:14.409568665 +0000 UTC m=+0.064891352 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.055 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:41:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:17.057 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
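[annotation] The DbAddCommand above fails with RowNotFound because the Chassis_Private row briefly disappears around the southbound reconnect. A hedged sketch of guarding that lookup, not the agent's actual code; sb_idl is assumed to be an already-connected ovsdbapp southbound API object exposing the generic lookup/db_add commands:

    # Sketch only: tolerate the RowNotFound seen in the traceback above.
    from ovsdbapp.backend.ovs_idl import idlutils

    def set_metadata_id(sb_idl, chassis_name, metadata_id):
        try:
            chassis = sb_idl.lookup('Chassis_Private', chassis_name)
        except idlutils.RowNotFound:
            # The row can vanish while ovn-controller re-registers the chassis
            # after an SB reconnect; report failure and retry later.
            return False
        sb_idl.db_add('Chassis_Private', chassis.name, 'external_ids',
                      {'neutron:ovn-metadata-id': metadata_id}).execute(
                          check_error=True)
        return True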
Jan 30 04:41:17 np0005601978 podman[216582]: 2026-01-30 09:41:17.382548909 +0000 UTC m=+0.047385575 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:41:21 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:21Z|00202|chassis|WARN|Dropped 4 log messages in last 31 seconds (most recently, 20 seconds ago) due to excessive rate
Jan 30 04:41:21 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:21Z|00203|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:41:21 np0005601978 podman[216602]: 2026-01-30 09:41:21.43042864 +0000 UTC m=+0.088787283 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:41:24 np0005601978 podman[216628]: 2026-01-30 09:41:24.436591312 +0000 UTC m=+0.093622312 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:41:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:25.178 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:41:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:25.183 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:41:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:25.188 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:41:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:25.189 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:41:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:35.191 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:41:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:35.192 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:41:36 np0005601978 podman[216653]: 2026-01-30 09:41:36.377308753 +0000 UTC m=+0.040025794 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, io.openshift.expose-services=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:41:38 np0005601978 podman[216674]: 2026-01-30 09:41:38.402741141 +0000 UTC m=+0.063494817 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 30 04:41:45 np0005601978 podman[216695]: 2026-01-30 09:41:45.416712556 +0000 UTC m=+0.066804697 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.346 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.346 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.346 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.370 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.371 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.371 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.372 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.372 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.373 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.373 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.373 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.373 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.396 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.397 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.397 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.397 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.553 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.554 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5964MB free_disk=73.3598403930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.554 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.555 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.640 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.640 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.660 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.687 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
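The inventory reported to placement in the line above is what determines schedulable capacity for this node. Assuming placement's usual capacity formula, capacity = (total - reserved) * allocation_ratio per resource class; a small self-contained Python sketch with the values copied from the log line (the helper name is ours, not nova's or placement's):

    # Hypothetical helper: schedulable capacity from a placement-style inventory,
    # assuming capacity = (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
    }

    def capacity(inv):
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(capacity(inventory))
    # -> {'VCPU': 32.0, 'MEMORY_MB': 7167.0, 'DISK_GB': 70.2}

With these allocation ratios the 8-vCPU, 7679 MB host is advertised as roughly 32 vCPUs, ~7.0 GiB of RAM and ~70 GB of disk to the scheduler.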
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.688 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:41:46 np0005601978 nova_compute[182955]: 2026-01-30 09:41:46.688 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
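The "Acquiring lock / acquired / released" DEBUG triplets above are emitted by oslo.concurrency's lock wrapper around the resource tracker methods. A minimal sketch of the same pattern with lockutils.synchronized (the function below is ours, purely illustrative, not nova's actual code):

    from oslo_concurrency import lockutils

    # Guarding a critical section with a named lock produces the same
    # "Acquiring lock ... / acquired / released" DEBUG lines seen above.
    @lockutils.synchronized('compute_resources')
    def update_resources():
        # critical section: mutate the shared resource view
        pass

    update_resources()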
Jan 30 04:41:48 np0005601978 nova_compute[182955]: 2026-01-30 09:41:48.389 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 44.99 sec#033[00m
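The WARNING above means the service-group _report_state callback took about 45 s longer than its fixed reporting interval (typically because the conductor/DB round-trip stalled). As a purely illustrative sketch in plain Python (not oslo.service's actual implementation), the overrun check in a fixed-interval loop amounts to:

    import time

    def run_fixed_interval(callback, interval):
        # Illustration: run callback every `interval` seconds and warn when a
        # run outlasts the interval, as oslo.service.loopingcall does above.
        while True:
            start = time.monotonic()
            callback()
            delay = interval - (time.monotonic() - start)
            if delay < 0:
                print('Function %r run outlasted interval by %.2f sec'
                      % (callback.__name__, -delay))
                delay = 0
            time.sleep(delay)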
Jan 30 04:41:48 np0005601978 podman[216717]: 2026-01-30 09:41:48.403224399 +0000 UTC m=+0.058684628 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
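The podman health_status records above come from each container's configured healthcheck ('test': '/openstack/healthcheck', mounted from /var/lib/openstack/healthchecks/...). The same check can be triggered on demand; a hedged sketch (container name taken from the log line, exit status 0 meaning healthy):

    import subprocess

    # Run the configured healthcheck for the ovn_metadata_agent container;
    # returncode 0 corresponds to the health_status=healthy seen above.
    result = subprocess.run(
        ['podman', 'healthcheck', 'run', 'ovn_metadata_agent'],
        capture_output=True, text=True,
    )
    print('healthy' if result.returncode == 0 else 'unhealthy', result.stdout)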
Jan 30 04:41:52 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:52Z|00204|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:41:52 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:52Z|00205|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
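The repeated chassis WARN above says the southbound database already records a geneve encap with IP 172.19.0.101 for chassis d14b9ab5-..., so the second record (9803b804-...) cannot claim the same encap. One way to inspect the competing rows, assuming ovn-sbctl on this node can reach the SB DB with its usual SSL options, is:

    import subprocess

    # Dump chassis and encap rows from the OVN southbound DB to see which
    # chassis record currently owns the 172.19.0.101 geneve encap.
    for table in ('Chassis', 'Encap'):
        out = subprocess.run(['ovn-sbctl', 'list', table],
                             capture_output=True, text=True)
        print(out.stdout)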
Jan 30 04:41:52 np0005601978 podman[216737]: 2026-01-30 09:41:52.410120772 +0000 UTC m=+0.073488450 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 30 04:41:55 np0005601978 podman[216764]: 2026-01-30 09:41:55.435005579 +0000 UTC m=+0.092052883 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:41:55 np0005601978 nova_compute[182955]: 2026-01-30 09:41:55.688 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:41:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:41:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:56Z|00206|reconnect|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: no response to inactivity probe after 60 seconds, disconnecting
Jan 30 04:41:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:56Z|00207|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped
Jan 30 04:41:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:56Z|00208|main|INFO|OVNSB commit failed, force recompute next time.
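The ERR/INFO/recompute sequence above is ovn-controller dropping and re-establishing its southbound connection after the 60-second inactivity probe timed out, which also forces the failed OVNSB commit to be recomputed. On a slow or busy SB DB this is commonly mitigated by raising the probe interval, which ovn-controller reads from the ovn-remote-probe-interval external-id (milliseconds); whether that is appropriate here is a judgment call for this deployment. A sketch of setting it from Python:

    import subprocess

    # Raise ovn-controller's southbound inactivity probe to 180 s
    # (value in milliseconds); ovn-controller re-reads this at runtime.
    subprocess.run(
        ['ovs-vsctl', 'set', 'Open_vSwitch', '.',
         'external_ids:ovn-remote-probe-interval=180000'],
        check=True,
    )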
Jan 30 04:41:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:57.356 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:41:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:57.357 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:41:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:41:57.357 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:41:57 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:57Z|00209|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:41:57 np0005601978 ovn_controller[95419]: 2026-01-30T09:41:57Z|00210|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:42:07 np0005601978 podman[216788]: 2026-01-30 09:42:07.399793995 +0000 UTC m=+0.059177632 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, vcs-type=git, build-date=2026-01-22T05:09:47Z)
Jan 30 04:42:09 np0005601978 podman[216809]: 2026-01-30 09:42:09.397497047 +0000 UTC m=+0.061286493 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 30 04:42:11 np0005601978 nova_compute[182955]: 2026-01-30 09:42:11.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:42:11 np0005601978 nova_compute[182955]: 2026-01-30 09:42:11.433 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:42:11 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:11.506 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:42:11 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:11.507 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:42:11 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:11.508 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:42:11 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:11.509 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:42:16 np0005601978 podman[216831]: 2026-01-30 09:42:16.42651015 +0000 UTC m=+0.086648571 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:42:19 np0005601978 podman[216855]: 2026-01-30 09:42:19.415249347 +0000 UTC m=+0.068101490 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:42:23 np0005601978 ovn_controller[95419]: 2026-01-30T09:42:23Z|00211|chassis|WARN|Dropped 6 log messages in last 26 seconds (most recently, 22 seconds ago) due to excessive rate
Jan 30 04:42:23 np0005601978 ovn_controller[95419]: 2026-01-30T09:42:23Z|00212|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:42:23 np0005601978 podman[216874]: 2026-01-30 09:42:23.450316235 +0000 UTC m=+0.104259701 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:42:26 np0005601978 podman[216901]: 2026-01-30 09:42:26.395748336 +0000 UTC m=+0.054117198 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:42:38 np0005601978 podman[216925]: 2026-01-30 09:42:38.432196861 +0000 UTC m=+0.093016797 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, release=1769056855, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7)
Jan 30 04:42:40 np0005601978 podman[216946]: 2026-01-30 09:42:40.4322793 +0000 UTC m=+0.092158755 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:42:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:42.617 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:42:42 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:42.619 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:42:45 np0005601978 ovn_controller[95419]: 2026-01-30T09:42:45Z|00213|chassis|WARN|Dropped 1 log messages in last 22 seconds (most recently, 22 seconds ago) due to excessive rate
Jan 30 04:42:45 np0005601978 ovn_controller[95419]: 2026-01-30T09:42:45Z|00214|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:42:47 np0005601978 podman[216967]: 2026-01-30 09:42:47.387204248 +0000 UTC m=+0.047317913 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:42:50 np0005601978 podman[216991]: 2026-01-30 09:42:50.384577395 +0000 UTC m=+0.039546273 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:42:54 np0005601978 podman[217011]: 2026-01-30 09:42:54.415534585 +0000 UTC m=+0.074187308 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Jan 30 04:42:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:57.358 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:42:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:57.359 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:42:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:42:57.359 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:42:57 np0005601978 podman[217037]: 2026-01-30 09:42:57.383080386 +0000 UTC m=+0.046357440 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:42:57 np0005601978 ovn_controller[95419]: 2026-01-30T09:42:57Z|00215|chassis|WARN|Dropped 22 log messages in last 13 seconds (most recently, 4 seconds ago) due to excessive rate
Jan 30 04:42:57 np0005601978 ovn_controller[95419]: 2026-01-30T09:42:57Z|00216|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:43:03 np0005601978 nova_compute[182955]: 2026-01-30 09:43:03.140 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 44.75 sec#033[00m
Jan 30 04:43:07 np0005601978 nova_compute[182955]: 2026-01-30 09:43:07.235 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:43:07 np0005601978 nova_compute[182955]: 2026-01-30 09:43:07.236 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:07 np0005601978 nova_compute[182955]: 2026-01-30 09:43:07.236 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:43:07 np0005601978 nova_compute[182955]: 2026-01-30 09:43:07.249 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.264 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.265 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.278 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.278 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.278 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.293 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.293 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.293 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.294 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.294 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.294 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.294 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.294 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.294 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.323 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.324 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.324 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.324 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:43:09 np0005601978 podman[217063]: 2026-01-30 09:43:09.388929944 +0000 UTC m=+0.049522738 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.448 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.449 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5967MB free_disk=73.3598403930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.449 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.449 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.557 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.557 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.578 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.590 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.591 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:43:09 np0005601978 nova_compute[182955]: 2026-01-30 09:43:09.591 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:11 np0005601978 podman[217084]: 2026-01-30 09:43:11.415530569 +0000 UTC m=+0.069961675 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 30 04:43:18 np0005601978 podman[217104]: 2026-01-30 09:43:18.374200546 +0000 UTC m=+0.040516327 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:43:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:18.899 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:43:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:18.899 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:43:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:18.900 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:43:18 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:18.901 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:43:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:19.909 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:43:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:19.910 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:43:20 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:20.911 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:43:20 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:20.912 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:43:20 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:20.913 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:43:20 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:20.913 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:43:21 np0005601978 podman[217129]: 2026-01-30 09:43:21.408159034 +0000 UTC m=+0.060861543 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 30 04:43:22 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:22.920 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:43:22 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:22.920 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:43:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:24.922 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:43:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:24.923 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:43:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:24.923 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:43:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:24.923 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:43:25 np0005601978 ovn_controller[95419]: 2026-01-30T09:43:25Z|00217|chassis|WARN|Dropped 14 log messages in last 27 seconds (most recently, 16 seconds ago) due to excessive rate
Jan 30 04:43:25 np0005601978 ovn_controller[95419]: 2026-01-30T09:43:25Z|00218|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:43:25 np0005601978 podman[217149]: 2026-01-30 09:43:25.42625415 +0000 UTC m=+0.085427421 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:43:28 np0005601978 podman[217176]: 2026-01-30 09:43:28.436134572 +0000 UTC m=+0.084181691 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:43:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:28.927 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:43:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:28.931 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:43:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:32.935 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:43:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:32.935 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:43:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:32.936 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:43:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:32.937 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:43:40 np0005601978 podman[217201]: 2026-01-30 09:43:40.432976009 +0000 UTC m=+0.080619935 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, release=1769056855, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:43:42 np0005601978 podman[217222]: 2026-01-30 09:43:42.391058385 +0000 UTC m=+0.056916476 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:43:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:48.073 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:43:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:48.076 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:43:49 np0005601978 podman[217245]: 2026-01-30 09:43:49.405034292 +0000 UTC m=+0.068601082 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:43:52 np0005601978 podman[217269]: 2026-01-30 09:43:52.400620084 +0000 UTC m=+0.059460999 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:43:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:53.790 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:43:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:53.791 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:43:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:43:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:43:56Z|00219|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:43:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:43:56Z|00220|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:43:56 np0005601978 podman[217288]: 2026-01-30 09:43:56.480381302 +0000 UTC m=+0.138014573 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:43:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:56.793 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:43:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:56.793 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:43:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:57.359 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:43:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:57.360 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:43:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:43:57.360 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:43:59 np0005601978 podman[217315]: 2026-01-30 09:43:59.398950439 +0000 UTC m=+0.054923269 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:44:08 np0005601978 ovn_controller[95419]: 2026-01-30T09:44:08Z|00221|chassis|WARN|Dropped 17 log messages in last 13 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 30 04:44:08 np0005601978 ovn_controller[95419]: 2026-01-30T09:44:08Z|00222|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:44:09 np0005601978 nova_compute[182955]: 2026-01-30 09:44:09.593 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:09 np0005601978 nova_compute[182955]: 2026-01-30 09:44:09.593 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:09 np0005601978 nova_compute[182955]: 2026-01-30 09:44:09.593 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:44:09 np0005601978 nova_compute[182955]: 2026-01-30 09:44:09.594 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:44:11 np0005601978 podman[217339]: 2026-01-30 09:44:11.416584426 +0000 UTC m=+0.073913401 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.7, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, distribution-scope=public, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1769056855, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:44:13 np0005601978 podman[217360]: 2026-01-30 09:44:13.398286418 +0000 UTC m=+0.058075476 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.059 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:44:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:17.060 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
Jan 30 04:44:20 np0005601978 podman[217381]: 2026-01-30 09:44:20.410611374 +0000 UTC m=+0.058867115 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:44:20 np0005601978 ovn_controller[95419]: 2026-01-30T09:44:20Z|00223|chassis|WARN|Dropped 29 log messages in last 12 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 30 04:44:20 np0005601978 ovn_controller[95419]: 2026-01-30T09:44:20Z|00224|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:44:23 np0005601978 podman[217406]: 2026-01-30 09:44:23.377386106 +0000 UTC m=+0.043869239 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 30 04:44:27 np0005601978 podman[217426]: 2026-01-30 09:44:27.422370996 +0000 UTC m=+0.081718500 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:44:30 np0005601978 podman[217452]: 2026-01-30 09:44:30.404527153 +0000 UTC m=+0.064494531 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:44:33 np0005601978 ovn_controller[95419]: 2026-01-30T09:44:33Z|00225|chassis|WARN|Dropped 15 log messages in last 12 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 30 04:44:33 np0005601978 ovn_controller[95419]: 2026-01-30T09:44:33Z|00226|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.814 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.814 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.815 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.815 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.815 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.816 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.816 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.816 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.816 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.839 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.840 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.840 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:35 np0005601978 nova_compute[182955]: 2026-01-30 09:44:35.840 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.003 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.004 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5960MB free_disk=73.3598403930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.005 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.005 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.070 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.070 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.092 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.110 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.111 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:44:36 np0005601978 nova_compute[182955]: 2026-01-30 09:44:36.111 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:37 np0005601978 nova_compute[182955]: 2026-01-30 09:44:37.860 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 44.71 sec#033[00m
Jan 30 04:44:42 np0005601978 podman[217475]: 2026-01-30 09:44:42.428798402 +0000 UTC m=+0.088201968 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 30 04:44:44 np0005601978 podman[217496]: 2026-01-30 09:44:44.379377469 +0000 UTC m=+0.044839827 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:44:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:46.322 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:44:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:46.323 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:44:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:46.324 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:44:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:46.325 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:44:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:44:47Z|00227|chassis|WARN|Dropped 6 log messages in last 14 seconds (most recently, 11 seconds ago) due to excessive rate
Jan 30 04:44:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:44:47Z|00228|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:44:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:47.332 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:44:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:47.332 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:44:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:47.339 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:44:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:47.339 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:44:51 np0005601978 podman[217516]: 2026-01-30 09:44:51.380668962 +0000 UTC m=+0.045312219 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:44:54 np0005601978 podman[217540]: 2026-01-30 09:44:54.398564948 +0000 UTC m=+0.053431556 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:44:55 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:55.438 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:44:55 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:55.438 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:44:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:57.361 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:44:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:57.361 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:44:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:44:57.361 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:44:58 np0005601978 podman[217559]: 2026-01-30 09:44:58.397824459 +0000 UTC m=+0.060636610 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 30 04:45:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:45:01Z|00229|chassis|WARN|Dropped 25 log messages in last 14 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 30 04:45:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:45:01Z|00230|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:45:01 np0005601978 podman[217585]: 2026-01-30 09:45:01.373428879 +0000 UTC m=+0.037653902 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:45:02 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:02.440 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:45:02 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:02.441 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:45:13 np0005601978 podman[217610]: 2026-01-30 09:45:13.392861008 +0000 UTC m=+0.052944694 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, distribution-scope=public, name=ubi9/ubi-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Jan 30 04:45:15 np0005601978 podman[217631]: 2026-01-30 09:45:15.41378813 +0000 UTC m=+0.068513001 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:45:19 np0005601978 ovn_controller[95419]: 2026-01-30T09:45:19Z|00231|chassis|WARN|Dropped 9 log messages in last 16 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 30 04:45:19 np0005601978 ovn_controller[95419]: 2026-01-30T09:45:19Z|00232|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:45:22 np0005601978 podman[217651]: 2026-01-30 09:45:22.386241452 +0000 UTC m=+0.049353926 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:45:25 np0005601978 podman[217673]: 2026-01-30 09:45:25.405387027 +0000 UTC m=+0.060790613 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:45:29 np0005601978 podman[217691]: 2026-01-30 09:45:29.392038625 +0000 UTC m=+0.059085762 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:45:32 np0005601978 podman[217717]: 2026-01-30 09:45:32.40081054 +0000 UTC m=+0.066629556 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:45:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:34.643 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:45:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:34.643 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:45:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:34.645 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:45:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:34.645 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:45:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:35.650 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:45:35 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:35.650 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:45:36 np0005601978 nova_compute[182955]: 2026-01-30 09:45:36.112 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:36 np0005601978 nova_compute[182955]: 2026-01-30 09:45:36.113 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:36.652 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:45:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:36.652 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:45:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:36.652 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:45:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:36.653 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:45:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:38.658 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:45:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:38.658 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:45:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:38.663 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:45:38 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:38.663 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:45:40 np0005601978 ovn_controller[95419]: 2026-01-30T09:45:40Z|00233|chassis|WARN|Dropped 13 log messages in last 22 seconds (most recently, 12 seconds ago) due to excessive rate
Jan 30 04:45:40 np0005601978 ovn_controller[95419]: 2026-01-30T09:45:40Z|00234|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:45:44 np0005601978 nova_compute[182955]: 2026-01-30 09:45:44.076 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [SYS] unknown error (_ssl.c:2501)#033[00m
Jan 30 04:45:44 np0005601978 podman[217742]: 2026-01-30 09:45:44.42776773 +0000 UTC m=+0.085067641 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.7, name=ubi9/ubi-minimal, distribution-scope=public, container_name=openstack_network_exporter, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 30 04:45:46 np0005601978 podman[217764]: 2026-01-30 09:45:46.409271837 +0000 UTC m=+0.067326282 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.571 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.572 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.572 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.593 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.593 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.594 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.594 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.595 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.595 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.595 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.596 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.596 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.621 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.621 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.622 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.622 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.775 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.777 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5965MB free_disk=73.3598403930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.777 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.777 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.897 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.897 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:45:50 np0005601978 nova_compute[182955]: 2026-01-30 09:45:50.953 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing inventories for resource provider 5912bad0-7860-4f37-8078-1db5720295f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:45:51 np0005601978 nova_compute[182955]: 2026-01-30 09:45:51.006 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating ProviderTree inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:45:51 np0005601978 nova_compute[182955]: 2026-01-30 09:45:51.007 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:45:51 np0005601978 nova_compute[182955]: 2026-01-30 09:45:51.023 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing aggregate associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:45:51 np0005601978 nova_compute[182955]: 2026-01-30 09:45:51.042 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing trait associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:45:51 np0005601978 nova_compute[182955]: 2026-01-30 09:45:51.066 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:45:51 np0005601978 nova_compute[182955]: 2026-01-30 09:45:51.082 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:45:51 np0005601978 nova_compute[182955]: 2026-01-30 09:45:51.084 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:45:51 np0005601978 nova_compute[182955]: 2026-01-30 09:45:51.085 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:45:53 np0005601978 nova_compute[182955]: 2026-01-30 09:45:53.125 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 45.26 sec#033[00m
Jan 30 04:45:53 np0005601978 podman[217785]: 2026-01-30 09:45:53.405427304 +0000 UTC m=+0.060687490 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:45:55.762 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:45:56 np0005601978 podman[217812]: 2026-01-30 09:45:56.385140306 +0000 UTC m=+0.042587493 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:45:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:57.362 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:45:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:57.362 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:45:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:45:57.362 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:46:00 np0005601978 ovn_controller[95419]: 2026-01-30T09:46:00Z|00235|chassis|WARN|Dropped 5 log messages in last 19 seconds (most recently, 9 seconds ago) due to excessive rate
Jan 30 04:46:00 np0005601978 ovn_controller[95419]: 2026-01-30T09:46:00Z|00236|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:46:00 np0005601978 podman[217832]: 2026-01-30 09:46:00.41429086 +0000 UTC m=+0.073251375 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 30 04:46:03 np0005601978 podman[217858]: 2026-01-30 09:46:03.376018996 +0000 UTC m=+0.039729584 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:46:15 np0005601978 podman[217883]: 2026-01-30 09:46:15.430885691 +0000 UTC m=+0.091116238 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., release=1769056855, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter)
Jan 30 04:46:17 np0005601978 podman[217904]: 2026-01-30 09:46:17.400392688 +0000 UTC m=+0.062584538 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 30 04:46:24 np0005601978 podman[217925]: 2026-01-30 09:46:24.398527025 +0000 UTC m=+0.058038017 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:46:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:26.546 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:46:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:26.546 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:46:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:26.548 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:46:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:26.549 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:46:27 np0005601978 podman[217949]: 2026-01-30 09:46:27.402303098 +0000 UTC m=+0.062634288 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:46:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:27.556 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:46:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:27.556 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:46:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:28.558 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:46:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:28.558 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:46:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:28.558 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:46:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:28.558 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:46:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:30.562 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:46:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:30.563 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:46:31 np0005601978 ovn_controller[95419]: 2026-01-30T09:46:31Z|00237|chassis|WARN|Dropped 4 log messages in last 31 seconds (most recently, 28 seconds ago) due to excessive rate
Jan 30 04:46:31 np0005601978 ovn_controller[95419]: 2026-01-30T09:46:31Z|00238|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:46:31 np0005601978 podman[217970]: 2026-01-30 09:46:31.402662637 +0000 UTC m=+0.067207049 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 30 04:46:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:32.565 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:46:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:32.566 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:46:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:32.566 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:46:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:32.566 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:46:34 np0005601978 podman[217996]: 2026-01-30 09:46:34.405384744 +0000 UTC m=+0.062840774 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:46:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:36.575 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:46:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:36.576 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:46:40 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:40.576 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:46:40 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:40.577 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:46:40 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:40.578 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:46:40 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:40.578 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
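The ovsdbapp IDL client above retries the southbound connection with an increasing backoff (2 seconds, then 4 seconds) and, after a few logged failures, keeps retrying silently until it finally reports "connected" at 09:46:48. The following is a self-contained Python sketch of that pattern under assumed names and limits; it is not the real ovsdbapp/OVS reconnect state machine.

    # Minimal sketch of capped exponential backoff with log suppression.
    import itertools
    import time

    def reconnect(try_connect, max_logged_attempts=3, initial_backoff=2, max_backoff=8):
        backoff = initial_backoff
        for attempt in itertools.count(1):
            if try_connect():
                print("connected")
                return attempt
            if attempt <= max_logged_attempts:
                print(f"connection attempt timed out; waiting {backoff} seconds before reconnect")
            elif attempt == max_logged_attempts + 1:
                print("continuing to reconnect in the background but suppressing further logging")
            time.sleep(backoff)
            backoff = min(backoff * 2, max_backoff)

    # Example run: fail five times, then succeed (short waits so it finishes quickly).
    attempts = iter([False] * 5 + [True])
    reconnect(lambda: next(attempts), initial_backoff=0.01, max_backoff=0.04)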
Jan 30 04:46:45 np0005601978 ovn_controller[95419]: 2026-01-30T09:46:45Z|00239|chassis|WARN|Dropped 1 log messages in last 15 seconds (most recently, 15 seconds ago) due to excessive rate
Jan 30 04:46:45 np0005601978 ovn_controller[95419]: 2026-01-30T09:46:45Z|00240|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:46:46 np0005601978 podman[218020]: 2026-01-30 09:46:46.407543352 +0000 UTC m=+0.063952638 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, release=1769056855, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Jan 30 04:46:48 np0005601978 podman[218042]: 2026-01-30 09:46:48.408695785 +0000 UTC m=+0.074561558 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:46:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:48.595 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:46:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:48.598 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:46:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:48.604 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:46:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:48.606 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:46:51 np0005601978 nova_compute[182955]: 2026-01-30 09:46:51.087 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:51 np0005601978 nova_compute[182955]: 2026-01-30 09:46:51.088 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:46:51 np0005601978 nova_compute[182955]: 2026-01-30 09:46:51.088 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:46:51 np0005601978 nova_compute[182955]: 2026-01-30 09:46:51.089 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:46:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:53.607 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:46:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:53.608 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
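The "Transaction caused no change" result above is the expected outcome of the delayed chassis update: DbSetCommand with if_exists=True merges the key into external_ids, and the row already carries 'neutron:ovn-metadata-sb-cfg': '25'. A minimal sketch of that idempotent update, written in plain Python rather than with ovsdbapp, and with hypothetical row contents:

    def db_set_external_ids(rows, record, updates, if_exists=True):
        # Mirrors the idea of DbSetCommand(..., if_exists=True): missing row is
        # tolerated, and an update that changes nothing commits nothing.
        row = rows.get(record)
        if row is None:
            if if_exists:
                return "no-op: row not found"
            raise KeyError(record)
        before = dict(row["external_ids"])
        row["external_ids"].update(updates)
        return "changed" if row["external_ids"] != before else "Transaction caused no change"

    chassis_private = {
        "9803b804-d88a-4443-b777-6ecddbb75ed8": {
            "external_ids": {"neutron:ovn-metadata-sb-cfg": "25"},  # already at nb_cfg 25
        },
    }
    print(db_set_external_ids(chassis_private,
                              "9803b804-d88a-4443-b777-6ecddbb75ed8",
                              {"neutron:ovn-metadata-sb-cfg": "25"}))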
Jan 30 04:46:55 np0005601978 podman[218062]: 2026-01-30 09:46:55.382299136 +0000 UTC m=+0.047445661 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:46:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:57.363 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:46:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:57.364 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:46:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:46:57.364 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:46:58 np0005601978 podman[218086]: 2026-01-30 09:46:58.378068155 +0000 UTC m=+0.041871935 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 30 04:46:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:46:58Z|00241|chassis|WARN|Dropped 38 log messages in last 13 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 30 04:46:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:46:58Z|00242|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:47:02 np0005601978 podman[218105]: 2026-01-30 09:47:02.406665629 +0000 UTC m=+0.071552445 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 30 04:47:05 np0005601978 podman[218131]: 2026-01-30 09:47:05.399358915 +0000 UTC m=+0.055032915 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:47:15 np0005601978 ovn_controller[95419]: 2026-01-30T09:47:15Z|00243|chassis|WARN|Dropped 31 log messages in last 17 seconds (most recently, 9 seconds ago) due to excessive rate
Jan 30 04:47:15 np0005601978 ovn_controller[95419]: 2026-01-30T09:47:15Z|00244|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.062 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.062 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:47:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:17.063 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
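The traceback above ends in idlutils.row_by_value: the lookup scans the local copy of the table for a row whose column matches the requested value and raises RowNotFound when nothing matches, which is what happens when the DbAddCommand runs after the Chassis_Private record named 9803b804-d88a-4443-b777-6ecddbb75ed8 is no longer present. A minimal sketch of that behaviour, re-implementing the idea only (it is not the ovsdbapp code), with hypothetical table contents:

    class RowNotFound(Exception):
        def __init__(self, table, col, match):
            super().__init__(f"Cannot find {table} with {col}={match}")

    def row_by_value(rows, table, column, match):
        # Linear scan for the first row whose column equals the match value.
        for row in rows:
            if row.get(column) == match:
                return row
        raise RowNotFound(table, column, match)

    # Hypothetical state after the southbound connection flapped: no chassis row left.
    chassis_private_rows = []
    try:
        row_by_value(chassis_private_rows, "Chassis_Private", "name",
                     "9803b804-d88a-4443-b777-6ecddbb75ed8")
    except RowNotFound as exc:
        print(exc)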
Jan 30 04:47:17 np0005601978 podman[218155]: 2026-01-30 09:47:17.455600983 +0000 UTC m=+0.064406691 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1769056855, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Jan 30 04:47:18 np0005601978 nova_compute[182955]: 2026-01-30 09:47:18.116 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 44.99 sec#033[00m
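The oslo.service warning above means one execution of the fixed-interval looping call took longer than its interval, by 44.99 seconds. A small sketch of how such a loop measures the overrun; the 10-second report interval used in nova is an assumption here, only the overrun figure comes from the log line, and the short values in the example keep it quick to run.

    import time

    def run_periodically(func, interval, iterations):
        # Fixed-interval loop: if a run takes longer than the interval,
        # report by how much it outlasted it and skip the sleep.
        for _ in range(iterations):
            start = time.monotonic()
            func()
            elapsed = time.monotonic() - start
            delay = interval - elapsed
            if delay < 0:
                print(f"Function {func.__name__!r} run outlasted interval by {-delay:.2f} sec")
                delay = 0
            time.sleep(delay)

    def report_state():
        time.sleep(0.02)  # simulate a state report that stalls on the database

    run_periodically(report_state, interval=0.01, iterations=1)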
Jan 30 04:47:19 np0005601978 podman[218178]: 2026-01-30 09:47:19.403931266 +0000 UTC m=+0.063837897 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.164 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.165 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.165 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.165 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.165 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.165 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.166 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.166 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.166 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.261 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.262 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.262 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.262 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.439 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.441 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5971MB free_disk=73.35982131958008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.441 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.441 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.570 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.571 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.594 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.616 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.619 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:47:20 np0005601978 nova_compute[182955]: 2026-01-30 09:47:20.620 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
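The inventory nova reports to placement above translates into schedulable capacity as (total - reserved) * allocation_ratio per resource class. A worked example using exactly the figures from the "Inventory has not changed" line at 09:47:20:

    # Capacity per resource class from the reported inventory.
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity:g}")
    # Prints: VCPU: 32, MEMORY_MB: 7167, DISK_GB: 70.2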
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.465 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.466 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.466 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.490 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.490 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.491 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.514 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.515 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.515 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.515 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.712 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.713 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5971MB free_disk=73.3598403930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.713 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.714 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.792 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.792 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.811 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.834 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.835 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.835 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.835 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.835 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:47:24 np0005601978 nova_compute[182955]: 2026-01-30 09:47:24.867 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:47:26 np0005601978 podman[218199]: 2026-01-30 09:47:26.39461615 +0000 UTC m=+0.056422448 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:47:28 np0005601978 nova_compute[182955]: 2026-01-30 09:47:28.809 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:28 np0005601978 nova_compute[182955]: 2026-01-30 09:47:28.810 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:28 np0005601978 nova_compute[182955]: 2026-01-30 09:47:28.810 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:47:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:29.330 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:47:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:29.331 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:47:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:29.332 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:47:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:29.333 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:47:29 np0005601978 podman[218225]: 2026-01-30 09:47:29.39335454 +0000 UTC m=+0.053596689 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:47:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:30.338 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:47:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:30.338 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:47:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:31.340 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:47:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:31.340 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:47:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:31.341 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:47:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:31.341 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:47:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:33.345 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:47:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:33.345 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:47:33 np0005601978 ovn_controller[95419]: 2026-01-30T09:47:33Z|00245|chassis|WARN|Dropped 10 log messages in last 17 seconds (most recently, 14 seconds ago) due to excessive rate
Jan 30 04:47:33 np0005601978 ovn_controller[95419]: 2026-01-30T09:47:33Z|00246|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
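
The warning above means the southbound database already records the encapsulation (ip '172.19.0.101', type 'geneve') under chassis d14b9ab5-bf6e-4142-ad45-b863645e483d, so the same pair cannot also be attached to chassis 9803b804-d88a-4443-b777-6ecddbb75ed8. A minimal sketch of that uniqueness rule using the identifiers from the log (a simplification for illustration, not ovn-controller code):

    # One (encap ip, encap type) pair may be owned by only one chassis record;
    # a second chassis claiming the same pair triggers the WARN line above.
    existing_encaps = {("172.19.0.101", "geneve"): "d14b9ab5-bf6e-4142-ad45-b863645e483d"}

    def can_claim(encap_ip, encap_type, chassis):
        owner = existing_encaps.get((encap_ip, encap_type))
        return owner is None or owner == chassis

    print(can_claim("172.19.0.101", "geneve",
                    "9803b804-d88a-4443-b777-6ecddbb75ed8"))  # False -> duplicate rejected
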
Jan 30 04:47:33 np0005601978 podman[218247]: 2026-01-30 09:47:33.459528293 +0000 UTC m=+0.120367437 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 30 04:47:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:34.383 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:47:34 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:34.385 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:47:36 np0005601978 podman[218275]: 2026-01-30 09:47:36.392566752 +0000 UTC m=+0.055294230 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:47:41 np0005601978 nova_compute[182955]: 2026-01-30 09:47:41.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:47:41 np0005601978 nova_compute[182955]: 2026-01-30 09:47:41.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 30 04:47:43 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:43.489 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:47:43 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:43.489 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:47:46 np0005601978 ovn_controller[95419]: 2026-01-30T09:47:46Z|00247|chassis|WARN|Dropped 31 log messages in last 14 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 30 04:47:46 np0005601978 ovn_controller[95419]: 2026-01-30T09:47:46Z|00248|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:47:48 np0005601978 podman[218299]: 2026-01-30 09:47:48.387658351 +0000 UTC m=+0.051899467 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 30 04:47:50 np0005601978 podman[218320]: 2026-01-30 09:47:50.419862886 +0000 UTC m=+0.074522166 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 30 04:47:50 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:50.493 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:47:50 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:50.494 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:47:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:47:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:57.364 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:47:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:57.365 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:47:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:47:57.365 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
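
The acquire/release DEBUG triplet above (and the matching "compute_resources" messages elsewhere in this log) comes from oslo.concurrency's lock instrumentation, which reports how long a caller waited for a named lock and how long it held it. A minimal sketch, assuming oslo.concurrency is installed and DEBUG logging is configured; the function below is illustrative, not Neutron's actual code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized("_check_child_processes")
    def check_child_processes():
        # Runs with the named in-process lock held; the decorator's wrapper logs
        # "Acquiring lock ...", "Lock ... acquired ... waited Ns" and
        # "Lock ... released ... held Ns", as seen in the lines above.
        pass

    check_child_processes()
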
Jan 30 04:47:57 np0005601978 podman[218341]: 2026-01-30 09:47:57.410238113 +0000 UTC m=+0.062781361 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:47:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:47:58Z|00249|chassis|WARN|Dropped 41 log messages in last 12 seconds (most recently, 1 seconds ago) due to excessive rate
Jan 30 04:47:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:47:58Z|00250|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:48:00 np0005601978 podman[218366]: 2026-01-30 09:48:00.405531441 +0000 UTC m=+0.068977723 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:48:04 np0005601978 podman[218385]: 2026-01-30 09:48:04.427241156 +0000 UTC m=+0.087144022 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 30 04:48:07 np0005601978 podman[218411]: 2026-01-30 09:48:07.385618369 +0000 UTC m=+0.044376016 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:48:16 np0005601978 nova_compute[182955]: 2026-01-30 09:48:16.931 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:16 np0005601978 nova_compute[182955]: 2026-01-30 09:48:16.932 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:19.314 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:48:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:19.314 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:48:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:19.316 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:48:19 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:19.316 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:48:19 np0005601978 podman[218435]: 2026-01-30 09:48:19.379066277 +0000 UTC m=+0.039483127 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 30 04:48:20 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:20.321 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:48:20 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:20.322 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:48:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:21.322 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:48:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:21.323 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:48:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:21.324 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:48:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:21.324 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:48:21 np0005601978 podman[218458]: 2026-01-30 09:48:21.391628525 +0000 UTC m=+0.056804955 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:48:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:23.329 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:48:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:23.329 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:48:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:25.330 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:48:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:25.331 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:48:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:25.331 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:48:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:25.332 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:48:28 np0005601978 podman[218480]: 2026-01-30 09:48:28.4045328 +0000 UTC m=+0.067697041 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:48:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:29.339 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:48:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:29.339 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:48:31 np0005601978 podman[218504]: 2026-01-30 09:48:31.439455669 +0000 UTC m=+0.096375486 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 30 04:48:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:33.343 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:48:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:33.343 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:48:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:33.345 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:48:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:33.345 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
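
The sequence from 09:48:19 to 09:48:33 above shows the OVSDB client losing its southbound connection, retrying after 2 seconds, then 4 seconds, and finally continuing to retry quietly while suppressing further log lines. A stdlib-only sketch of that observable pattern (the real reconnect logic lives in the ovs/ovsdbapp client; connect_once, the backoff cap, and the logging threshold below are illustrative assumptions):

    import time

    def reconnect(connect_once, max_logged_attempts=2):
        # Doubling backoff with log suppression, mirroring the lines above:
        # wait 2 s, then 4 s, then keep retrying without further log output.
        wait = 2
        attempts = 0
        while True:
            try:
                return connect_once()
            except OSError:
                attempts += 1
                if attempts <= max_logged_attempts:
                    print(f"connection attempt timed out; waiting {wait} seconds before reconnect")
                elif attempts == max_logged_attempts + 1:
                    print("continuing to reconnect in the background but suppressing further logging")
                time.sleep(wait)
                wait = min(wait * 2, 8)  # cap chosen arbitrarily for the sketch

    # Example with a hypothetical target:
    # import socket
    # reconnect(lambda: socket.create_connection(("ovsdbserver-sb.openstack.svc", 6642), timeout=1))
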
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.405 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.409 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 36.29 sec#033[00m
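
The WARNING above is emitted by oslo.service's fixed-interval looping call when a periodic callback takes longer than its scheduled interval (here the service-group state report slipped by about 36 seconds). A minimal sketch, assuming oslo.service is installed and logging is configured; the 1-second interval and 3-second sleep are made-up values for illustration:

    import time
    from oslo_service import loopingcall

    calls = {"n": 0}

    def report_state():
        calls["n"] += 1
        time.sleep(3)                               # deliberately slower than the 1 s interval
        if calls["n"] >= 2:
            raise loopingcall.LoopingCallDone()     # stop the loop after two runs

    # Each overrun produces a "Function ... run outlasted interval by N sec" warning.
    loopingcall.FixedIntervalLoopingCall(report_state).start(interval=1).wait()
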
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.409 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.511 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.511 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.520 182959 DEBUG nova.virt.hardware [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.520 182959 INFO nova.compute.claims [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.645 182959 DEBUG nova.compute.provider_tree [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.659 182959 DEBUG nova.scheduler.client.report [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.678 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.679 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.723 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.724 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.739 182959 INFO nova.virt.libvirt.driver [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.754 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.847 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.848 182959 DEBUG nova.virt.libvirt.driver [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.849 182959 INFO nova.virt.libvirt.driver [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Creating image(s)#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.849 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "/var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.850 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "/var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.850 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "/var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.866 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.909 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.910 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "27f3756dd30074249f54b073a56d4c88beec31b4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.911 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.928 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.971 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:34 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.972 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:34.999 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4,backing_fmt=raw /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.001 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "27f3756dd30074249f54b073a56d4c88beec31b4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.001 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.063 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.065 182959 DEBUG nova.virt.disk.api [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Checking if we can resize image /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.066 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.115 182959 DEBUG oslo_concurrency.processutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.116 182959 DEBUG nova.virt.disk.api [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Cannot resize image /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.117 182959 DEBUG nova.objects.instance [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lazy-loading 'migration_context' on Instance uuid cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.130 182959 DEBUG nova.virt.libvirt.driver [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.131 182959 DEBUG nova.virt.libvirt.driver [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Ensure instance console log exists: /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.131 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.132 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:35 np0005601978 nova_compute[182955]: 2026-01-30 09:48:35.132 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:35 np0005601978 ovn_controller[95419]: 2026-01-30T09:48:35Z|00251|chassis|WARN|Dropped 43 log messages in last 36 seconds (most recently, 26 seconds ago) due to excessive rate
Jan 30 04:48:35 np0005601978 ovn_controller[95419]: 2026-01-30T09:48:35Z|00252|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:48:35 np0005601978 podman[218540]: 2026-01-30 09:48:35.429108067 +0000 UTC m=+0.094909791 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 30 04:48:36 np0005601978 nova_compute[182955]: 2026-01-30 09:48:36.781 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Successfully created port: de284ccb-9c09-4911-b0da-c32015ef19bc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 30 04:48:38 np0005601978 podman[218566]: 2026-01-30 09:48:38.393270481 +0000 UTC m=+0.049978152 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:48:38 np0005601978 nova_compute[182955]: 2026-01-30 09:48:38.760 182959 DEBUG nova.compute.manager [req-11199d15-287f-4271-bd9f-2b5e4e0f2758 req-3b9dbfb9-68fc-42f6-b6b9-2ffd2673f12c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Received event network-changed-de284ccb-9c09-4911-b0da-c32015ef19bc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 30 04:48:38 np0005601978 nova_compute[182955]: 2026-01-30 09:48:38.760 182959 DEBUG nova.compute.manager [req-11199d15-287f-4271-bd9f-2b5e4e0f2758 req-3b9dbfb9-68fc-42f6-b6b9-2ffd2673f12c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Refreshing instance network info cache due to event network-changed-de284ccb-9c09-4911-b0da-c32015ef19bc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 30 04:48:38 np0005601978 nova_compute[182955]: 2026-01-30 09:48:38.761 182959 DEBUG oslo_concurrency.lockutils [req-11199d15-287f-4271-bd9f-2b5e4e0f2758 req-3b9dbfb9-68fc-42f6-b6b9-2ffd2673f12c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquiring lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:48:38 np0005601978 nova_compute[182955]: 2026-01-30 09:48:38.761 182959 DEBUG oslo_concurrency.lockutils [req-11199d15-287f-4271-bd9f-2b5e4e0f2758 req-3b9dbfb9-68fc-42f6-b6b9-2ffd2673f12c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Acquired lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:48:38 np0005601978 nova_compute[182955]: 2026-01-30 09:48:38.761 182959 DEBUG nova.network.neutron [req-11199d15-287f-4271-bd9f-2b5e4e0f2758 req-3b9dbfb9-68fc-42f6-b6b9-2ffd2673f12c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Refreshing network info cache for port de284ccb-9c09-4911-b0da-c32015ef19bc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.299 182959 DEBUG nova.network.neutron [req-11199d15-287f-4271-bd9f-2b5e4e0f2758 req-3b9dbfb9-68fc-42f6-b6b9-2ffd2673f12c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port de284ccb-9c09-4911-b0da-c32015ef19bc, please check neutron logs for more information.
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager Traceback (most recent call last):
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager     vif.destroy()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager     self.force_reraise()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager     raise self.value
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager     updated_port = self._update_port(
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port de284ccb-9c09-4911-b0da-c32015ef19bc, please check neutron logs for more information.
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.364 182959 ERROR nova.compute.manager #033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: Traceback (most recent call last):
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/hubs/poll.py", line 111, in wait
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    listener.cb(fileno)
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    result = function(*args, **kwargs)
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    return func(*args, **kwargs)
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    raise e
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    created_port_ids = self._update_ports_for_instance(
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    vif.destroy()
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    self.force_reraise()
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    raise self.value
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    updated_port = self._update_port(
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    _ensure_no_port_binding_failure(port)
Jan 30 04:48:39 np0005601978 nova_compute[182955]:  File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:48:39 np0005601978 nova_compute[182955]:    raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:48:39 np0005601978 nova_compute[182955]: nova.exception.PortBindingFailed: Binding failed for port de284ccb-9c09-4911-b0da-c32015ef19bc, please check neutron logs for more information.
Jan 30 04:48:39 np0005601978 nova_compute[182955]: Removing descriptor: 21
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port de284ccb-9c09-4911-b0da-c32015ef19bc, please check neutron logs for more information.
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Traceback (most recent call last):
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     yield resources
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     network_info_str = str(network_info)
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     self.wait()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     self[:] = self._gt.wait()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     return self._exit_event.wait()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     result = hub.switch()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     return self.greenlet.switch()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     result = function(*args, **kwargs)
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     return func(*args, **kwargs)
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     raise e
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     vif.destroy()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     self.force_reraise()
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     raise self.value
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     updated_port = self._update_port(
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     _ensure_no_port_binding_failure(port)
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] nova.exception.PortBindingFailed: Binding failed for port de284ccb-9c09-4911-b0da-c32015ef19bc, please check neutron logs for more information.
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.365 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] #033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.366 182959 INFO nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Terminating instance#033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.367 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.621 182959 DEBUG nova.network.neutron [req-11199d15-287f-4271-bd9f-2b5e4e0f2758 req-3b9dbfb9-68fc-42f6-b6b9-2ffd2673f12c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.640 182959 DEBUG oslo_concurrency.lockutils [req-11199d15-287f-4271-bd9f-2b5e4e0f2758 req-3b9dbfb9-68fc-42f6-b6b9-2ffd2673f12c dffdcf0e45b54d0a9cd22b90229fc150 482929733b9142a8b6e55ec228598291 - - default default] Releasing lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.641 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquired lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.641 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:48:39 np0005601978 nova_compute[182955]: 2026-01-30 09:48:39.747 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.000 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.015 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Releasing lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.016 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.021 182959 DEBUG nova.virt.libvirt.driver [-] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.021 182959 INFO nova.virt.libvirt.driver [-] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Instance destroyed successfully.#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.022 182959 INFO nova.virt.libvirt.driver [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Deleting instance files /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d_del#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.023 182959 INFO nova.virt.libvirt.driver [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Deletion of /var/lib/nova/instances/cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d_del complete#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.075 182959 INFO nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Took 0.06 seconds to destroy the instance on the hypervisor.#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.076 182959 DEBUG oslo.service.loopingcall [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.077 182959 DEBUG nova.compute.manager [-] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.077 182959 DEBUG nova.network.neutron [-] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.473 182959 DEBUG nova.network.neutron [-] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.490 182959 DEBUG nova.network.neutron [-] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.511 182959 INFO nova.compute.manager [-] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Took 0.43 seconds to deallocate network for instance.#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.514 182959 DEBUG nova.compute.claims [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Aborting claim: <nova.compute.claims.Claim object at 0x7f61002406a0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.515 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.515 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.682 182959 DEBUG nova.compute.provider_tree [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.697 182959 DEBUG nova.scheduler.client.report [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port de284ccb-9c09-4911-b0da-c32015ef19bc, please check neutron logs for more information.
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Traceback (most recent call last):
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     self.driver.spawn(context, instance, image_meta,
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4407, in spawn
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     xml = self._get_guest_xml(context, instance, network_info,
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7538, in _get_guest_xml
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     network_info_str = str(network_info)
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 620, in __str__
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     return self._sync_wrapper(fn, *args, **kwargs)
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 603, in _sync_wrapper
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     self.wait()
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/model.py", line 635, in wait
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     self[:] = self._gt.wait()
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 181, in wait
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     return self._exit_event.wait()
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     result = hub.switch()
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     return self.greenlet.switch()
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/eventlet/greenthread.py", line 221, in main
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     result = function(*args, **kwargs)
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/utils.py", line 654, in context_wrapper
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     return func(*args, **kwargs)
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1982, in _allocate_network_async
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     raise e
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 1960, in _allocate_network_async
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     nwinfo = self.network_api.allocate_for_instance(
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1229, in allocate_for_instance
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     created_port_ids = self._update_ports_for_instance(
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1371, in _update_ports_for_instance
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     vif.destroy()
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     self.force_reraise()
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     raise self.value
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1340, in _update_ports_for_instance
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     updated_port = self._update_port(
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 585, in _update_port
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     _ensure_no_port_binding_failure(port)
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]   File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d]     raise exception.PortBindingFailed(port_id=port['id'])
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] nova.exception.PortBindingFailed: Binding failed for port de284ccb-9c09-4911-b0da-c32015ef19bc, please check neutron logs for more information.
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.725 182959 ERROR nova.compute.manager [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] #033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.727 182959 DEBUG nova.compute.utils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Binding failed for port de284ccb-9c09-4911-b0da-c32015ef19bc, please check neutron logs for more information. notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.728 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Retry info not present, will not reschedule _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2437#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.728 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.728 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquiring lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.728 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Acquired lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.729 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 30 04:48:40 np0005601978 nova_compute[182955]: 2026-01-30 09:48:40.892 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:48:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:41.351 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:48:41 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:41.358 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.471 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.494 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Releasing lock "refresh_cache-cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.494 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.494 182959 DEBUG nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.494 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.620 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.635 182959 DEBUG nova.network.neutron [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.648 182959 INFO nova.compute.manager [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] [instance: cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d] Took 0.15 seconds to deallocate network for instance.#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.816 182959 INFO nova.scheduler.client.report [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Deleted allocations for instance cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d#033[00m
Jan 30 04:48:41 np0005601978 nova_compute[182955]: 2026-01-30 09:48:41.817 182959 DEBUG oslo_concurrency.lockutils [None req-86275373-f42a-45c1-a9ea-917dc8e06c76 bc74e93bd60c4618af5bc22cb2c9535e d1f6575f78b14c7d87ed252e603d0a6d - - default default] Lock "cd0d8177-86b2-40a4-9ee4-cc4fffc20c9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:45.136 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:48:45 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:45.136 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:48:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:47.139 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:48:47 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:47.140 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:48:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:48:47Z|00253|chassis|WARN|Dropped 20 log messages in last 13 seconds (most recently, 2 seconds ago) due to excessive rate
Jan 30 04:48:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:48:47Z|00254|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:48:50 np0005601978 podman[218590]: 2026-01-30 09:48:50.3748532 +0000 UTC m=+0.040747137 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, release=1769056855, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:48:52 np0005601978 podman[218611]: 2026-01-30 09:48:52.388324273 +0000 UTC m=+0.049584094 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 30 04:48:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:57.365 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:48:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:57.366 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:48:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:48:57.366 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:48:59 np0005601978 podman[218631]: 2026-01-30 09:48:59.378296139 +0000 UTC m=+0.039663244 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:49:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:49:01Z|00255|chassis|WARN|Dropped 27 log messages in last 13 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 30 04:49:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:49:01Z|00256|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:49:02 np0005601978 podman[218655]: 2026-01-30 09:49:02.39938152 +0000 UTC m=+0.047282968 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:49:06 np0005601978 podman[218675]: 2026-01-30 09:49:06.394210288 +0000 UTC m=+0.056494620 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:49:07 np0005601978 nova_compute[182955]: 2026-01-30 09:49:07.533 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:07 np0005601978 nova_compute[182955]: 2026-01-30 09:49:07.533 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:07 np0005601978 nova_compute[182955]: 2026-01-30 09:49:07.534 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:49:07 np0005601978 nova_compute[182955]: 2026-01-30 09:49:07.534 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:49:09 np0005601978 podman[218701]: 2026-01-30 09:49:09.38080626 +0000 UTC m=+0.042283459 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:49:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:49:13Z|00257|chassis|WARN|Dropped 4 log messages in last 10 seconds (most recently, 7 seconds ago) due to excessive rate
Jan 30 04:49:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:49:13Z|00258|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:49:20 np0005601978 nova_compute[182955]: 2026-01-30 09:49:20.997 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:20.997 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:20.998 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:20.998 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:20.998 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:20.998 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:20.998 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.022 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.022 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.022 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.051 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.052 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.052 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.052 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.194 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.195 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5958MB free_disk=73.35982131958008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.195 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.196 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.258 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.259 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.278 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.289 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.307 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.307 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:49:21 np0005601978 nova_compute[182955]: 2026-01-30 09:49:21.307 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:21 np0005601978 podman[218726]: 2026-01-30 09:49:21.403554444 +0000 UTC m=+0.062967725 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, managed_by=edpm_ansible, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:49:23 np0005601978 nova_compute[182955]: 2026-01-30 09:49:23.044 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 18.62 sec#033[00m
Jan 30 04:49:23 np0005601978 podman[218747]: 2026-01-30 09:49:23.398848436 +0000 UTC m=+0.056858948 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 30 04:49:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:23.955 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:49:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:23.955 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:49:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:23.956 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:49:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:23.956 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:49:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:24.961 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:49:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:24.962 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:49:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:25.962 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:49:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:25.963 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:49:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:25.964 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:49:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:25.965 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.205 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.206 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.224 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.224 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.224 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.237 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.238 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.258 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.259 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.259 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.260 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.401 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.402 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5955MB free_disk=73.35982131958008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.402 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.402 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.480 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.480 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.497 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.513 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.514 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.514 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
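
Note: the resource-tracker cycle just above reports raw inventory (8 vCPUs, 7679 MB RAM, 79 GB disk) together with allocation ratios and reserved amounts; Placement derives schedulable capacity from those figures. A minimal sketch of that arithmetic follows, assuming the usual Placement formula capacity = (total - reserved) * allocation_ratio; the function and variable names are illustrative, not Nova's own.

    def schedulable(total, reserved, allocation_ratio):
        """Capacity Placement can hand out from one inventory record."""
        return (total - reserved) * allocation_ratio

    # Figures taken from the inventory logged above for provider 5912bad0-...
    vcpu      = schedulable(8,    0,   4.0)  # 32 schedulable VCPU
    memory_mb = schedulable(7679, 512, 1.0)  # 7167 MB
    disk_gb   = schedulable(79,   1,   0.9)  # 70.2 GB

    print(vcpu, memory_mb, disk_gb)
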
Jan 30 04:49:26 np0005601978 nova_compute[182955]: 2026-01-30 09:49:26.709 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:27.971 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:49:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:27.971 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:49:28 np0005601978 nova_compute[182955]: 2026-01-30 09:49:28.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:29.972 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:49:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:29.973 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:49:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:29.974 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:49:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:29.975 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:49:30 np0005601978 podman[218767]: 2026-01-30 09:49:30.405659606 +0000 UTC m=+0.061940730 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:49:30 np0005601978 nova_compute[182955]: 2026-01-30 09:49:30.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:49:30 np0005601978 nova_compute[182955]: 2026-01-30 09:49:30.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
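
Note: the _reclaim_queued_deletes entries show the common periodic-task guard: the task still fires on schedule but returns immediately when its controlling interval option is zero or negative. A minimal, hypothetical sketch of that pattern is below; the option is read from a plain dict here rather than from oslo.config.

    CONF = {"reclaim_instance_interval": 0}  # 0 or negative disables the task

    def reclaim_queued_deletes():
        """Mirror of the guard seen in the log: skip when the interval is <= 0."""
        interval = CONF["reclaim_instance_interval"]
        if interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return
        # ... real work would purge soft-deleted instances older than `interval` ...

    reclaim_queued_deletes()
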
Jan 30 04:49:33 np0005601978 podman[218791]: 2026-01-30 09:49:33.40437162 +0000 UTC m=+0.068044507 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:49:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:33.990 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:49:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:33.990 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:49:37 np0005601978 ovn_controller[95419]: 2026-01-30T09:49:37Z|00259|chassis|WARN|Dropped 3 log messages in last 24 seconds (most recently, 23 seconds ago) due to excessive rate
Jan 30 04:49:37 np0005601978 ovn_controller[95419]: 2026-01-30T09:49:37Z|00260|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
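
Note: the chassis warnings above mean the southbound DB already holds a Geneve encap for 172.19.0.101 under chassis d14b9ab5-..., so the same (type, ip) pair cannot be registered again for chassis 9803b804-... (typically a stale or duplicate chassis record left over from a rename or re-registration). The sketch below only illustrates that uniqueness check in plain Python; it is not OVN's implementation.

    # Illustrative only: one (encap_type, encap_ip) pair may map to one chassis.
    encaps = {("geneve", "172.19.0.101"): "d14b9ab5-bf6e-4142-ad45-b863645e483d"}

    def register_encap(chassis, encap_type, encap_ip):
        owner = encaps.get((encap_type, encap_ip))
        if owner is not None and owner != chassis:
            raise ValueError(
                f"'{owner}' already has encap ip '{encap_ip}' and type "
                f"'{encap_type}', cannot duplicate on '{chassis}'")
        encaps[(encap_type, encap_ip)] = chassis

    try:
        register_encap("9803b804-d88a-4443-b777-6ecddbb75ed8", "geneve", "172.19.0.101")
    except ValueError as exc:
        print(f"WARN: {exc}")
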
Jan 30 04:49:37 np0005601978 podman[218810]: 2026-01-30 09:49:37.399898834 +0000 UTC m=+0.061141981 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:49:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:37.993 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:49:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:37.994 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:49:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:37.995 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:49:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:37.995 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
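
Note: the vlog lines above trace ovsdbapp's reconnect handling against ovsdbserver-sb.openstack.svc:6642: each SSL connect attempt times out, the client waits a few seconds, and after repeated failures it keeps retrying but stops logging every attempt. A generic sketch of that retry-with-quieting pattern is below, assuming a fixed backoff and a plain TCP connect; none of the names are ovsdbapp's.

    import socket, time

    def connect_with_retry(host, port, timeout=2.0, backoff=4.0, verbose_attempts=3):
        """Keep retrying a connect; go quiet after a few logged failures."""
        attempt = 0
        while True:
            attempt += 1
            try:
                return socket.create_connection((host, port), timeout=timeout)
            except OSError:
                if attempt <= verbose_attempts:
                    print(f"{host}:{port}: connection attempt timed out, "
                          f"waiting {backoff:.0f} seconds before reconnect")
                elif attempt == verbose_attempts + 1:
                    print(f"{host}:{port}: continuing to reconnect in the "
                          "background but suppressing further logging")
                time.sleep(backoff)

    # Example (blocks until the OVN SB endpoint becomes reachable):
    # sock = connect_with_retry("ovsdbserver-sb.openstack.svc", 6642)
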
Jan 30 04:49:40 np0005601978 podman[218836]: 2026-01-30 09:49:40.404194542 +0000 UTC m=+0.058575770 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:49:52 np0005601978 podman[218862]: 2026-01-30 09:49:52.414489846 +0000 UTC m=+0.071608533 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, release=1769056855, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc.)
Jan 30 04:49:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:53.129 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:49:53 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:53.131 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:49:54 np0005601978 podman[218883]: 2026-01-30 09:49:54.401831207 +0000 UTC m=+0.061864819 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.761 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.762 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:49:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:49:55.762 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
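
Note: the block of ceilometer DEBUG lines above is one polling cycle in which discovery returned no libvirt instances, so every compute pollster (cpu, memory.usage, disk.device.*, network.*) is skipped rather than sampled. A compact sketch of that poll-and-skip loop follows; the pollster list and discover_resources helper are placeholders, not ceilometer's API.

    POLLSTERS = ["cpu", "memory.usage", "disk.device.usage", "network.incoming.bytes"]

    def discover_resources():
        """Placeholder: would ask libvirt for running instances; none exist here."""
        return []

    def poll_and_notify():
        resources = discover_resources()
        for name in POLLSTERS:
            if not resources:
                print(f"Skip pollster {name}, no resources found this cycle")
                continue
            # ... collect samples for `name` against `resources` and publish them ...

    poll_and_notify()
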
Jan 30 04:49:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:49:56Z|00261|chassis|WARN|Dropped 1 log messages in last 19 seconds (most recently, 19 seconds ago) due to excessive rate
Jan 30 04:49:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:49:56Z|00262|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:49:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:57.367 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:49:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:57.368 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:49:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:49:57.368 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:50:01 np0005601978 podman[218904]: 2026-01-30 09:50:01.402616203 +0000 UTC m=+0.066853999 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:50:04 np0005601978 podman[218928]: 2026-01-30 09:50:04.413858987 +0000 UTC m=+0.075476136 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 30 04:50:08 np0005601978 ovn_controller[95419]: 2026-01-30T09:50:08Z|00263|chassis|WARN|Dropped 5 log messages in last 12 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 30 04:50:08 np0005601978 ovn_controller[95419]: 2026-01-30T09:50:08Z|00264|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:50:08 np0005601978 podman[218949]: 2026-01-30 09:50:08.424304201 +0000 UTC m=+0.084123935 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 30 04:50:11 np0005601978 podman[218972]: 2026-01-30 09:50:11.404894738 +0000 UTC m=+0.064609884 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:50:16 np0005601978 nova_compute[182955]: 2026-01-30 09:50:16.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.065 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:50:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:17.066 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
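
Note: the transaction above tries to add 'neutron:ovn-metadata-id' to the Chassis_Private row named 9803b804-..., but the lookup finds no such row and raises RowNotFound. That is consistent with the chassis warnings earlier in this window: registration of that chassis name is being refused because of the duplicate encap, so its Chassis_Private record likely never appeared in the agent's view of the southbound DB. The sketch below only mirrors the lookup-by-value-or-raise pattern in plain Python; it is not ovsdbapp code.

    class RowNotFound(Exception):
        pass

    # Toy "table": rows keyed by UUID, each carrying a 'name' column.
    chassis_private = {
        "row-1": {"name": "d14b9ab5-bf6e-4142-ad45-b863645e483d", "external_ids": {}},
    }

    def row_by_value(table, column, match):
        """Return the first row whose `column` equals `match`, else raise."""
        for row in table.values():
            if row.get(column) == match:
                return row
        raise RowNotFound(f"Cannot find Chassis_Private with {column}={match}")

    try:
        row = row_by_value(chassis_private, "name",
                           "9803b804-d88a-4443-b777-6ecddbb75ed8")
        row["external_ids"]["neutron:ovn-metadata-id"] = (
            "cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7")
    except RowNotFound as exc:
        print(f"Error executing command (DbAddCommand): {exc}")
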
Jan 30 04:50:19 np0005601978 nova_compute[182955]: 2026-01-30 09:50:19.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:20 np0005601978 nova_compute[182955]: 2026-01-30 09:50:20.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:20 np0005601978 nova_compute[182955]: 2026-01-30 09:50:20.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:23 np0005601978 podman[218996]: 2026-01-30 09:50:23.409268732 +0000 UTC m=+0.063343804 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter)
Jan 30 04:50:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:23.474 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:50:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:23.474 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:50:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:23.476 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:50:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:23.476 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:50:24 np0005601978 nova_compute[182955]: 2026-01-30 09:50:24.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:24 np0005601978 nova_compute[182955]: 2026-01-30 09:50:24.433 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:50:24 np0005601978 nova_compute[182955]: 2026-01-30 09:50:24.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:50:25 np0005601978 podman[219018]: 2026-01-30 09:50:25.427670311 +0000 UTC m=+0.076267865 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 30 04:50:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:31.491 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:50:31 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:31.497 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:50:31 np0005601978 podman[219039]: 2026-01-30 09:50:31.608385325 +0000 UTC m=+0.090367504 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:50:35 np0005601978 podman[219062]: 2026-01-30 09:50:35.404263789 +0000 UTC m=+0.059708798 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 30 04:50:39 np0005601978 ovn_controller[95419]: 2026-01-30T09:50:39Z|00265|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:50:39 np0005601978 ovn_controller[95419]: 2026-01-30T09:50:39Z|00266|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:50:39 np0005601978 podman[219082]: 2026-01-30 09:50:39.451972238 +0000 UTC m=+0.109114095 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:50:42 np0005601978 podman[219108]: 2026-01-30 09:50:42.408612409 +0000 UTC m=+0.066852840 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.574 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.575 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.576 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.576 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.576 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.577 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.641 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.641 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.642 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.642 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.829 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.830 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5966MB free_disk=73.3598403930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.830 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.830 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.925 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.925 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.966 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.990 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.992 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:50:49 np0005601978 nova_compute[182955]: 2026-01-30 09:50:49.993 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:50:50 np0005601978 nova_compute[182955]: 2026-01-30 09:50:50.597 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 17.55 sec#033[00m
Jan 30 04:50:54 np0005601978 podman[219132]: 2026-01-30 09:50:54.402331105 +0000 UTC m=+0.061242485 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9/ubi-minimal, version=9.7, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc.)
Jan 30 04:50:56 np0005601978 podman[219153]: 2026-01-30 09:50:56.409155005 +0000 UTC m=+0.073442907 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:50:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:57.369 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:50:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:57.369 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:50:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:50:57.370 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:51:01Z|00267|chassis|WARN|Dropped 1 log messages in last 22 seconds (most recently, 22 seconds ago) due to excessive rate
Jan 30 04:51:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:51:01Z|00268|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:51:02 np0005601978 podman[219173]: 2026-01-30 09:51:02.409805609 +0000 UTC m=+0.073591961 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:51:06 np0005601978 podman[219198]: 2026-01-30 09:51:06.378169021 +0000 UTC m=+0.043525518 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Jan 30 04:51:10 np0005601978 podman[219217]: 2026-01-30 09:51:10.444124438 +0000 UTC m=+0.102347092 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 30 04:51:11 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:11.986 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:51:11 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:11.987 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:51:11 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:11.988 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:51:11 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:11.989 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:51:13 np0005601978 podman[219243]: 2026-01-30 09:51:13.380001361 +0000 UTC m=+0.041408257 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:51:18 np0005601978 nova_compute[182955]: 2026-01-30 09:51:18.851 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:51:25 np0005601978 podman[219267]: 2026-01-30 09:51:25.385093149 +0000 UTC m=+0.049060961 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.7, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Jan 30 04:51:27 np0005601978 podman[219287]: 2026-01-30 09:51:27.394394638 +0000 UTC m=+0.052121474 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:51:33 np0005601978 podman[219308]: 2026-01-30 09:51:33.376273781 +0000 UTC m=+0.039482650 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:51:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:36.011 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:51:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:36.023 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:51:37 np0005601978 podman[219330]: 2026-01-30 09:51:37.385966818 +0000 UTC m=+0.045045694 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:51:41 np0005601978 ovn_controller[95419]: 2026-01-30T09:51:41Z|00269|chassis|WARN|Dropped 5 log messages in last 40 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:51:41 np0005601978 ovn_controller[95419]: 2026-01-30T09:51:41Z|00270|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:51:41 np0005601978 podman[219349]: 2026-01-30 09:51:41.422350015 +0000 UTC m=+0.087191558 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Jan 30 04:51:44 np0005601978 podman[219375]: 2026-01-30 09:51:44.388435203 +0000 UTC m=+0.053027405 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:51:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:51:56 np0005601978 podman[219398]: 2026-01-30 09:51:56.406380932 +0000 UTC m=+0.060237599 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 30 04:51:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:57.371 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:51:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:57.372 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:51:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:57.372 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:51:58 np0005601978 podman[219420]: 2026-01-30 09:51:58.387226458 +0000 UTC m=+0.047321069 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:51:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:59.474 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:51:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:59.474 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:51:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:59.476 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:51:59 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:51:59.476 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:52:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:00.483 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:00 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:00.483 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:01Z|00271|chassis|WARN|Dropped 1 log messages in last 20 seconds (most recently, 20 seconds ago) due to excessive rate
Jan 30 04:52:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:01Z|00272|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:52:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:01.484 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:01.484 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:52:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:01.485 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:01 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:01.485 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:52:03 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:03Z|00273|reconnect|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: no response to inactivity probe after 60 seconds, disconnecting
Jan 30 04:52:03 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:03Z|00274|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped
Jan 30 04:52:03 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:03Z|00275|main|INFO|OVNSB commit failed, force recompute next time.
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.302 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.302 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.302 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.322 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.323 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.323 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.323 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.324 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.324 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.324 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.324 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.324 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.347 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.348 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.348 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.348 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.461 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.463 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5948MB free_disk=73.35982131958008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.463 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.463 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:03 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:03.489 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:03 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:03.490 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.568 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.568 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.629 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing inventories for resource provider 5912bad0-7860-4f37-8078-1db5720295f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.680 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating ProviderTree inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.680 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.701 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing aggregate associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.721 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing trait associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.741 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.760 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.762 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:52:03 np0005601978 nova_compute[182955]: 2026-01-30 09:52:03.762 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:04 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:04Z|00276|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:52:04 np0005601978 podman[219441]: 2026-01-30 09:52:04.394302736 +0000 UTC m=+0.054750088 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 30 04:52:05 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:05Z|00277|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:52:05 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:05Z|00278|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:52:05 np0005601978 nova_compute[182955]: 2026-01-30 09:52:05.349 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 44.75 sec#033[00m
Jan 30 04:52:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:05.491 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:05.491 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:52:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:05.492 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:05 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:05.492 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:52:06 np0005601978 nova_compute[182955]: 2026-01-30 09:52:06.340 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:07 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:07Z|00279|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:52:08 np0005601978 podman[219463]: 2026-01-30 09:52:08.387173497 +0000 UTC m=+0.050112537 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 30 04:52:09 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:09Z|00280|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:52:09 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:09Z|00281|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:52:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:09.499 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:09.500 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:09.508 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:52:09 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:09.510 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:52:12 np0005601978 podman[219482]: 2026-01-30 09:52:12.382225879 +0000 UTC m=+0.046001877 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 30 04:52:12 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:52:12 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Failed with result 'exit-code'.
Jan 30 04:52:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:13Z|00282|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:52:13 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:13Z|00283|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:52:15 np0005601978 podman[219502]: 2026-01-30 09:52:15.375180135 +0000 UTC m=+0.040205089 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:52:15 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:15Z|00284|chassis|WARN|Dropped 5 log messages in last 13 seconds (most recently, 3 seconds ago) due to excessive rate
Jan 30 04:52:15 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:15Z|00285|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:52:20 np0005601978 nova_compute[182955]: 2026-01-30 09:52:20.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:20 np0005601978 nova_compute[182955]: 2026-01-30 09:52:20.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:21 np0005601978 nova_compute[182955]: 2026-01-30 09:52:21.435 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:23 np0005601978 nova_compute[182955]: 2026-01-30 09:52:23.429 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:25 np0005601978 nova_compute[182955]: 2026-01-30 09:52:25.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:52:25 np0005601978 nova_compute[182955]: 2026-01-30 09:52:25.434 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:52:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:26.483 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:52:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:26.484 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:52:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:26.484 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:52:26 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:26.484 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:52:27 np0005601978 podman[219527]: 2026-01-30 09:52:27.413472873 +0000 UTC m=+0.066779057 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, name=ubi9/ubi-minimal, vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:52:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:27.491 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:27.492 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:28.493 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:28.494 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:52:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:28.499 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:28 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:28.500 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:52:29 np0005601978 podman[219549]: 2026-01-30 09:52:29.41014432 +0000 UTC m=+0.064581445 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:52:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:30.499 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:30 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:30.509 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:32.501 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:32.502 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:52:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:32.512 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:32.513 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:52:35 np0005601978 podman[219569]: 2026-01-30 09:52:35.402041133 +0000 UTC m=+0.060047126 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:52:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:36.506 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:36 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:36.518 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:52:39 np0005601978 podman[219592]: 2026-01-30 09:52:39.384154145 +0000 UTC m=+0.044370438 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 30 04:52:40 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:40.507 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:40 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:40.507 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:52:40 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:40.520 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:52:40 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:40.520 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:52:43 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:43Z|00286|chassis|WARN|Dropped 4 log messages in last 27 seconds (most recently, 27 seconds ago) due to excessive rate
Jan 30 04:52:43 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:43Z|00287|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:52:43 np0005601978 podman[219612]: 2026-01-30 09:52:43.434291453 +0000 UTC m=+0.094157745 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 30 04:52:46 np0005601978 podman[219638]: 2026-01-30 09:52:46.385081533 +0000 UTC m=+0.047110213 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:52:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:48.517 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:52:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:48.523 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:52:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:48.524 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:52:48 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:48.531 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:52:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:52.526 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:52:52 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:52.527 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:52:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:56Z|00288|chassis|WARN|Dropped 11 log messages in last 13 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 30 04:52:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:52:56Z|00289|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:52:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:57.373 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:52:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:57.373 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:52:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:52:57.373 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:52:58 np0005601978 podman[219662]: 2026-01-30 09:52:58.376125205 +0000 UTC m=+0.040228328 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:53:00 np0005601978 podman[219685]: 2026-01-30 09:53:00.39063023 +0000 UTC m=+0.050421123 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 30 04:53:06 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:06.162 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:53:06 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:06.162 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:53:06 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:06.164 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:53:06 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:06.164 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
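[editor's note] The four warnings above show the agent's two IDL threads (104657 and 105213) losing their TLS sessions to the southbound OVSDB at ovsdbserver-sb.openstack.svc:6642. A hedged connectivity probe, reusing the host/port from the log and the certificate paths the ovn_metadata_agent container mounts (see its config_data volumes in this log); it assumes it runs where those mounts are visible:

    import socket
    import ssl

    HOST, PORT = "ovsdbserver-sb.openstack.svc", 6642  # from the warnings above

    # Certificate paths as mounted inside ovn_metadata_agent (assumption:
    # running in a context where those paths exist).
    ctx = ssl.create_default_context(cafile="/etc/pki/tls/certs/ovndbca.crt")
    ctx.load_cert_chain("/etc/pki/tls/certs/ovndb.crt",
                        "/etc/pki/tls/private/ovndb.key")
    ctx.check_hostname = False  # assumption: OVN DB certs are not hostname-bound

    try:
        with socket.create_connection((HOST, PORT), timeout=5) as raw:
            with ctx.wrap_socket(raw, server_hostname=HOST) as tls:
                print("TLS handshake OK:", tls.version())
    except OSError as exc:
        print("endpoint unreachable:", exc)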
Jan 30 04:53:06 np0005601978 podman[219706]: 2026-01-30 09:53:06.375039074 +0000 UTC m=+0.039920530 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:53:10 np0005601978 podman[219730]: 2026-01-30 09:53:10.400181869 +0000 UTC m=+0.058879969 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 30 04:53:14 np0005601978 ovn_controller[95419]: 2026-01-30T09:53:14Z|00290|chassis|WARN|Dropped 5 log messages in last 18 seconds (most recently, 13 seconds ago) due to excessive rate
Jan 30 04:53:14 np0005601978 ovn_controller[95419]: 2026-01-30T09:53:14Z|00291|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
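[editor's note] The chassis warning above recurs throughout this section: encap ip 172.19.0.101 with type geneve is already registered on chassis d14b9ab5-... and cannot be duplicated on 9803b804-..., which points at a stale or duplicate chassis record in the southbound database. A hedged inspection sketch using ovn-sbctl (the --db URL is taken from this log; reaching an SSL endpoint would additionally need its --private-key/--certificate/--ca-cert options):

    import subprocess

    DB = "ssl:ovsdbserver-sb.openstack.svc:6642"  # endpoint named in this log

    # Dump the Chassis and Encap tables to see which chassis row currently
    # owns encap ip 172.19.0.101.
    for table in ("Chassis", "Encap"):
        out = subprocess.run(
            ["ovn-sbctl", "--db", DB, "list", table],
            capture_output=True, text=True, check=True,
        ).stdout
        print(f"=== {table} ===")
        print(out)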
Jan 30 04:53:14 np0005601978 podman[219749]: 2026-01-30 09:53:14.412101807 +0000 UTC m=+0.072302923 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 30 04:53:15 np0005601978 nova_compute[182955]: 2026-01-30 09:53:15.485 182959 ERROR oslo.messaging._drivers.impl_rabbit [-] [647d3e5a-90b9-4b94-a9df-3b503b41fd68] AMQP server on rabbitmq-cell1.openstack.svc:5671 is unreachable: . Trying again in 1 seconds.: socket.timeout
Jan 30 04:53:16 np0005601978 nova_compute[182955]: 2026-01-30 09:53:16.528 182959 INFO oslo.messaging._drivers.impl_rabbit [-] [647d3e5a-90b9-4b94-a9df-3b503b41fd68] Reconnected to AMQP server on rabbitmq-cell1.openstack.svc:5671 via [amqp] client with port 51386.
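[editor's note] The pair of messages above shows oslo.messaging's rabbit driver timing out and re-establishing its AMQP connection one second later. A hedged sketch of that retry pattern using kombu, the library underneath the driver; the URL, credentials and retry numbers are placeholders, not values read from this host:

    from kombu import Connection

    def errback(exc, interval):
        print(f"AMQP server unreachable: {exc!r}. Trying again in {interval} seconds.")

    # Port 5671 matches rabbitmq-cell1.openstack.svc:5671 above;
    # guest:guest is a placeholder credential.
    conn = Connection("amqp://guest:guest@rabbitmq-cell1.openstack.svc:5671//", ssl=True)
    conn.ensure_connection(errback=errback, max_retries=3,
                           interval_start=1, interval_step=1, interval_max=30)
    print("Reconnected to AMQP server on", conn.as_uri())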
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.069 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:53:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:17.071 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
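[editor's note] The traceback above fails inside api.lookup() because the Chassis_Private row named 9803b804-... is not present in the agent's replicated view of the southbound database, consistent with the reconnect churn and duplicate-chassis warnings around it. A simplified, self-contained model of the failing lookup (not ovsdbapp itself):

    # Simplified model of ovsdbapp's row_by_value(): scan a table replica for
    # a row whose column matches, and raise RowNotFound when the record has
    # not (yet) been replicated locally.
    class RowNotFound(Exception):
        def __init__(self, table, col, match):
            super().__init__(f"Cannot find {table} with {col}={match}")

    def row_by_value(rows, table, column, match):
        for row in rows:
            if row.get(column) == match:
                return row
        raise RowNotFound(table, column, match)

    chassis_private = [{"name": "d14b9ab5-bf6e-4142-ad45-b863645e483d"}]
    try:
        row_by_value(chassis_private, "Chassis_Private", "name",
                     "9803b804-d88a-4443-b777-6ecddbb75ed8")
    except RowNotFound as exc:
        print(exc)  # Cannot find Chassis_Private with name=9803b804-...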
Jan 30 04:53:17 np0005601978 podman[219775]: 2026-01-30 09:53:17.399144274 +0000 UTC m=+0.063146192 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:53:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:21.272 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:53:21 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:21.275 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:53:29 np0005601978 podman[219800]: 2026-01-30 09:53:29.383352442 +0000 UTC m=+0.046644464 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, vcs-type=git, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 30 04:53:31 np0005601978 podman[219821]: 2026-01-30 09:53:31.393676938 +0000 UTC m=+0.045669521 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 30 04:53:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:32.658 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:53:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:32.658 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:53:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:32.660 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:53:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:32.660 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:53:37 np0005601978 podman[219841]: 2026-01-30 09:53:37.398706118 +0000 UTC m=+0.064412062 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:53:41 np0005601978 podman[219866]: 2026-01-30 09:53:41.382163973 +0000 UTC m=+0.041832829 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:53:45 np0005601978 ovn_controller[95419]: 2026-01-30T09:53:45Z|00292|chassis|WARN|Dropped 2 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:53:45 np0005601978 ovn_controller[95419]: 2026-01-30T09:53:45Z|00293|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:53:45 np0005601978 podman[219885]: 2026-01-30 09:53:45.389304788 +0000 UTC m=+0.054869813 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:53:48 np0005601978 podman[219914]: 2026-01-30 09:53:48.377192925 +0000 UTC m=+0.042910494 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.757 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:53:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:53:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
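[editor's note] Each polling cycle, the ceilometer agent emits one "Skip pollster" line per meter for which discovery returned nothing, which is why the block above lists every disk.device.*, network.* and cpu pollster at the same timestamp. A simplified model of that skip logic (illustrative, not ceilometer's actual implementation):

    import logging

    logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(message)s")
    LOG = logging.getLogger("ceilometer.polling.manager")

    def poll_and_notify(pollsters, discovered):
        for name in pollsters:
            resources = discovered.get(name, [])
            if not resources:
                # Matches the DEBUG lines above: skip meters with no resources.
                LOG.debug("Skip pollster %s, no resources found this cycle", name)
                continue
            LOG.debug("Polling %s for %d resources", name, len(resources))

    poll_and_notify(["cpu", "memory.usage", "disk.device.iops"], {})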
Jan 30 04:53:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:56.688 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:53:56 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:56.690 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:53:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:57.374 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:53:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:57.375 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:53:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:53:57.375 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
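[editor's note] The three DEBUG lines above are oslo_concurrency's standard acquire/acquired/released trio for the named lock that serialises ProcessMonitor._check_child_processes. A minimal sketch of the same pattern, assuming oslo.concurrency is installed (the body is a placeholder, not Neutron's real method):

    from oslo_concurrency import lockutils

    @lockutils.synchronized("_check_child_processes")
    def _check_child_processes():
        # Placeholder: the real method inspects monitored child processes.
        pass

    # With debug logging enabled, each call emits the Acquiring/acquired/
    # released lines seen above.
    _check_child_processes()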
Jan 30 04:54:00 np0005601978 podman[219939]: 2026-01-30 09:54:00.385300799 +0000 UTC m=+0.045284551 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1769056855, vcs-type=git, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible)
Jan 30 04:54:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:01Z|00294|chassis|WARN|Dropped 3 log messages in last 16 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 30 04:54:01 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:01Z|00295|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:54:02 np0005601978 podman[219961]: 2026-01-30 09:54:02.373738677 +0000 UTC m=+0.039420980 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 30 04:54:08 np0005601978 podman[219981]: 2026-01-30 09:54:08.399208931 +0000 UTC m=+0.057317002 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 30 04:54:12 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:54:12.210 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:54:12 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:54:12.210 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected
Jan 30 04:54:12 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:54:12.211 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:54:12 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:54:12.212 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)
Jan 30 04:54:12 np0005601978 podman[220006]: 2026-01-30 09:54:12.376164208 +0000 UTC m=+0.040508956 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 30 04:54:15 np0005601978 nova_compute[182955]: 2026-01-30 09:54:15.629 182959 ERROR oslo.messaging._drivers.impl_rabbit [-] [eef56048-6002-4368-b0c6-414ffc4d2f43] AMQP server on rabbitmq-cell1.openstack.svc:5671 is unreachable: <RecoverableConnectionError: unknown error>. Trying again in 1 seconds.: amqp.exceptions.RecoverableConnectionError: <RecoverableConnectionError: unknown error>
Jan 30 04:54:16 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:16Z|00296|chassis|WARN|Dropped 1 log messages in last 15 seconds (most recently, 15 seconds ago) due to excessive rate
Jan 30 04:54:16 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:16Z|00297|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:54:16 np0005601978 podman[220025]: 2026-01-30 09:54:16.443320401 +0000 UTC m=+0.099924447 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 30 04:54:16 np0005601978 nova_compute[182955]: 2026-01-30 09:54:16.536 182959 WARNING nova.servicegroup.drivers.db [-] Lost connection to nova-conductor for reporting service status.: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID ed663b2236734bac9b38264dc3bde6d8
Jan 30 04:54:16 np0005601978 nova_compute[182955]: 2026-01-30 09:54:16.538 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 111.19 sec
Jan 30 04:54:16 np0005601978 nova_compute[182955]: 2026-01-30 09:54:16.685 182959 INFO oslo.messaging._drivers.impl_rabbit [-] [eef56048-6002-4368-b0c6-414ffc4d2f43] Reconnected to AMQP server on rabbitmq-cell1.openstack.svc:5671 via [amqp] client with port 43342.
Jan 30 04:54:16 np0005601978 nova_compute[182955]: 2026-01-30 09:54:16.690 182959 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : ed663b2236734bac9b38264dc3bde6d8
Jan 30 04:54:16 np0005601978 nova_compute[182955]: 2026-01-30 09:54:16.691 182959 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : ed663b2236734bac9b38264dc3bde6d8
Jan 30 04:54:16 np0005601978 nova_compute[182955]: 2026-01-30 09:54:16.692 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 30 04:54:16 np0005601978 nova_compute[182955]: 2026-01-30 09:54:16.693 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 30 04:54:16 np0005601978 nova_compute[182955]: 2026-01-30 09:54:16.693 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
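[editor's note] The loopingcall warning above means one _report_state invocation took 111.19 seconds longer than its configured interval, blocked behind the AMQP outage, which is why the service-status heartbeat was lost. A hedged model of how a fixed-interval loop detects such an overrun (illustrative, not oslo.service's code):

    import time

    def fixed_interval_loop(func, interval, cycles):
        for _ in range(cycles):
            start = time.monotonic()
            func()
            overrun = (time.monotonic() - start) - interval
            if overrun > 0:
                print(f"Function {func.__name__!r} run outlasted interval "
                      f"by {overrun:.2f} sec")
            else:
                time.sleep(-overrun)

    fixed_interval_loop(lambda: time.sleep(0.2), interval=0.1, cycles=2)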
Jan 30 04:54:19 np0005601978 podman[220052]: 2026-01-30 09:54:19.407453395 +0000 UTC m=+0.063619683 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:54:29 np0005601978 nova_compute[182955]: 2026-01-30 09:54:29.192 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: Too many heartbeats missed
Jan 30 04:54:31 np0005601978 podman[220078]: 2026-01-30 09:54:31.382279239 +0000 UTC m=+0.046998484 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:54:33 np0005601978 podman[220099]: 2026-01-30 09:54:33.391358805 +0000 UTC m=+0.054989656 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Jan 30 04:54:39 np0005601978 podman[220120]: 2026-01-30 09:54:39.394364965 +0000 UTC m=+0.053149331 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:54:43 np0005601978 podman[220144]: 2026-01-30 09:54:43.375713399 +0000 UTC m=+0.040963228 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:54:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:47Z|00298|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:54:47 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:47Z|00299|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:54:47 np0005601978 podman[220163]: 2026-01-30 09:54:47.474685316 +0000 UTC m=+0.134241974 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 30 04:54:50 np0005601978 podman[220191]: 2026-01-30 09:54:50.39761211 +0000 UTC m=+0.053310774 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:54:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:56Z|00300|reconnect|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: no response to inactivity probe after 60 seconds, disconnecting
Jan 30 04:54:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:56Z|00301|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped
Jan 30 04:54:56 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:56Z|00302|main|INFO|OVNSB commit failed, force recompute next time.
Jan 30 04:54:57 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:57Z|00303|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:54:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:54:57.375 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:54:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:54:57.376 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:54:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:54:57.376 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:54:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:58Z|00304|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:54:58 np0005601978 ovn_controller[95419]: 2026-01-30T09:54:58Z|00305|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Jan 30 04:55:00 np0005601978 ovn_controller[95419]: 2026-01-30T09:55:00Z|00306|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:55:02 np0005601978 ovn_controller[95419]: 2026-01-30T09:55:02Z|00307|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:55:02 np0005601978 ovn_controller[95419]: 2026-01-30T09:55:02Z|00308|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Jan 30 04:55:02 np0005601978 podman[220217]: 2026-01-30 09:55:02.379310187 +0000 UTC m=+0.040730821 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.openshift.expose-services=, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 30 04:55:04 np0005601978 podman[220239]: 2026-01-30 09:55:04.398646491 +0000 UTC m=+0.058900550 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Jan 30 04:55:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:55:06Z|00309|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 30 04:55:10 np0005601978 ovn_controller[95419]: 2026-01-30T09:55:10Z|00310|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Jan 30 04:55:10 np0005601978 ovn_controller[95419]: 2026-01-30T09:55:10Z|00311|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Jan 30 04:55:10 np0005601978 podman[220259]: 2026-01-30 09:55:10.38041897 +0000 UTC m=+0.044477952 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:55:14 np0005601978 podman[220283]: 2026-01-30 09:55:14.382626518 +0000 UTC m=+0.042797451 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 30 04:55:18 np0005601978 podman[220302]: 2026-01-30 09:55:18.387717282 +0000 UTC m=+0.048085690 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=1, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:55:18 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:55:18 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Failed with result 'exit-code'.
Jan 30 04:55:21 np0005601978 podman[220322]: 2026-01-30 09:55:21.408303576 +0000 UTC m=+0.066668627 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 30 04:55:33 np0005601978 podman[220347]: 2026-01-30 09:55:33.399113254 +0000 UTC m=+0.065021876 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1769056855, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 30 04:55:35 np0005601978 podman[220370]: 2026-01-30 09:55:35.435356983 +0000 UTC m=+0.092972600 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:55:41 np0005601978 podman[220390]: 2026-01-30 09:55:41.379500358 +0000 UTC m=+0.041046070 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:55:45 np0005601978 podman[220416]: 2026-01-30 09:55:45.398220113 +0000 UTC m=+0.063533581 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 30 04:55:49 np0005601978 podman[220435]: 2026-01-30 09:55:49.375157838 +0000 UTC m=+0.040326782 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=2, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Jan 30 04:55:49 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Main process exited, code=exited, status=1/FAILURE
Jan 30 04:55:49 np0005601978 systemd[1]: 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089-3e9df801eb05be27.service: Failed with result 'exit-code'.
Jan 30 04:55:52 np0005601978 podman[220455]: 2026-01-30 09:55:52.379382349 +0000 UTC m=+0.044930893 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:55 np0005601978 ceilometer_agent_compute[192697]: 2026-01-30 09:55:55.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 30 04:55:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:55:57.377 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 30 04:55:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:55:57.378 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 30 04:55:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:55:57.378 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 30 04:56:03 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:03.386 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:56:03 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:03.448 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:56:04 np0005601978 podman[220480]: 2026-01-30 09:56:04.397228118 +0000 UTC m=+0.054108543 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.7, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 30 04:56:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:06Z|00312|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 30 04:56:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:06Z|00313|chassis|WARN|Dropped 3 log messages in last 79 seconds (most recently, 70 seconds ago) due to excessive rate
Jan 30 04:56:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:06Z|00314|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:56:06 np0005601978 podman[220502]: 2026-01-30 09:56:06.384109418 +0000 UTC m=+0.048920529 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Jan 30 04:56:10 np0005601978 nova_compute[182955]: 2026-01-30 09:56:10.357 182959 ERROR oslo.messaging._drivers.impl_rabbit [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] [647d3e5a-90b9-4b94-a9df-3b503b41fd68] AMQP server on rabbitmq-cell1.openstack.svc:5671 is unreachable: . Trying again in 1 seconds.: socket.timeout#033[00m
Jan 30 04:56:11 np0005601978 nova_compute[182955]: 2026-01-30 09:56:11.382 182959 INFO oslo.messaging._drivers.impl_rabbit [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] [647d3e5a-90b9-4b94-a9df-3b503b41fd68] Reconnected to AMQP server on rabbitmq-cell1.openstack.svc:5671 via [amqp] client with port 60140.#033[00m
Jan 30 04:56:11 np0005601978 nova_compute[182955]: 2026-01-30 09:56:11.387 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:12 np0005601978 podman[220522]: 2026-01-30 09:56:12.382426088 +0000 UTC m=+0.041387088 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 30 04:56:16 np0005601978 podman[220545]: 2026-01-30 09:56:16.373118267 +0000 UTC m=+0.037274299 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 30 04:56:16 np0005601978 nova_compute[182955]: 2026-01-30 09:56:16.543 182959 ERROR oslo.messaging._drivers.impl_rabbit [-] [34731e33-da8c-4268-9d71-432061e8bee1] AMQP server on rabbitmq-cell1.openstack.svc:5671 is unreachable: . Trying again in 1 seconds.: socket.timeout#033[00m
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.073 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, column=external_ids, values=({'neutron:ovn-metadata-id': 'cea1d6e4-cd7e-5766-b297-91c3a2d2e9e7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.transaction [-] Traceback (most recent call last):
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:    txn.results.put(txn.do_commit())
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:    command.run_idl(txn)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:    record = self.api.lookup(self.table, self.record)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:    return self._lookup(table, record)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:    row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:  File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]:    raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: #033[00m
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (DbAddCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 50, in execute
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 120, in transaction
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/api.py", line 71, in __exit__
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 64, in commit
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 118, in run
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 92, in do_commit
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     command.run_idl(txn)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 180, in run_idl
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     record = self.api.lookup(self.table, self.record)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 183, in lookup
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 234, in _lookup
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Chassis_Private with name=9803b804-d88a-4443-b777-6ecddbb75ed8
Jan 30 04:56:17 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:17.074 104657 ERROR ovsdbapp.backend.ovs_idl.command #033[00m
Jan 30 04:56:17 np0005601978 nova_compute[182955]: 2026-01-30 09:56:17.566 182959 INFO oslo.messaging._drivers.impl_rabbit [-] [34731e33-da8c-4268-9d71-432061e8bee1] Reconnected to AMQP server on rabbitmq-cell1.openstack.svc:5671 via [amqp] client with port 43192.#033[00m
Jan 30 04:56:17 np0005601978 nova_compute[182955]: 2026-01-30 09:56:17.572 182959 INFO nova.servicegroup.drivers.db [-] Recovered from being unable to report status.#033[00m
Jan 30 04:56:17 np0005601978 nova_compute[182955]: 2026-01-30 09:56:17.573 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 111.03 sec#033[00m
Jan 30 04:56:20 np0005601978 nova_compute[182955]: 2026-01-30 09:56:20.109 182959 ERROR oslo.messaging._drivers.impl_rabbit [-] [eef56048-6002-4368-b0c6-414ffc4d2f43] AMQP server on rabbitmq-cell1.openstack.svc:5671 is unreachable: Too many heartbeats missed. Trying again in 1 seconds.: amqp.exceptions.ConnectionForced: Too many heartbeats missed#033[00m
Jan 30 04:56:20 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:20Z|00315|chassis|WARN|Dropped 14 log messages in last 14 seconds (most recently, 6 seconds ago) due to excessive rate
Jan 30 04:56:20 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:20Z|00316|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:56:20 np0005601978 podman[220565]: 2026-01-30 09:56:20.411663357 +0000 UTC m=+0.077180080 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 30 04:56:21 np0005601978 nova_compute[182955]: 2026-01-30 09:56:21.135 182959 INFO oslo.messaging._drivers.impl_rabbit [-] [eef56048-6002-4368-b0c6-414ffc4d2f43] Reconnected to AMQP server on rabbitmq-cell1.openstack.svc:5671 via [amqp] client with port 43200.#033[00m
Jan 30 04:56:21 np0005601978 nova_compute[182955]: 2026-01-30 09:56:21.137 182959 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : 2c0f66b39d58422b9fca3afa880497f1#033[00m
Jan 30 04:56:21 np0005601978 nova_compute[182955]: 2026-01-30 09:56:21.138 182959 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : 029850fe8e234e3a94881f62bd500016#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.139 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.139 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.161 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.162 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.162 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.190 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.190 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.191 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.191 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.191 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.192 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.192 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.192 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.192 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.215 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.215 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.216 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.216 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.330 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.330 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5962MB free_disk=73.35983657836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.331 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.331 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.404 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.404 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.428 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.448 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.449 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:56:22 np0005601978 nova_compute[182955]: 2026-01-30 09:56:22.449 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:56:23 np0005601978 podman[220591]: 2026-01-30 09:56:23.397526446 +0000 UTC m=+0.052791132 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:56:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:32.068 104657 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '06:8b:2d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ad:7d:cb:97:dc'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 30 04:56:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:32.069 104657 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 30 04:56:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:32.069 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9803b804-d88a-4443-b777-6ecddbb75ed8, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 30 04:56:32 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:32.070 104657 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 30 04:56:32 np0005601978 nova_compute[182955]: 2026-01-30 09:56:32.372 182959 ERROR oslo.messaging._drivers.impl_rabbit [-] [1cffe5b1-6500-4c7f-a6b9-bb4d3ead000e] AMQP server on rabbitmq-cell1.openstack.svc:5671 is unreachable: <RecoverableConnectionError: unknown error>. Trying again in 1 seconds.: amqp.exceptions.RecoverableConnectionError: <RecoverableConnectionError: unknown error>#033[00m
Jan 30 04:56:32 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:32Z|00317|chassis|WARN|Dropped 7 log messages in last 12 seconds (most recently, 0 seconds ago) due to excessive rate
Jan 30 04:56:32 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:32Z|00318|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:56:33 np0005601978 nova_compute[182955]: 2026-01-30 09:56:33.408 182959 INFO oslo.messaging._drivers.impl_rabbit [-] [1cffe5b1-6500-4c7f-a6b9-bb4d3ead000e] Reconnected to AMQP server on rabbitmq-cell1.openstack.svc:5671 via [amqp] client with port 51642.#033[00m
Jan 30 04:56:35 np0005601978 podman[220615]: 2026-01-30 09:56:35.386220183 +0000 UTC m=+0.046123012 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.7, io.openshift.tags=minimal rhel9, release=1769056855, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, distribution-scope=public, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 30 04:56:37 np0005601978 podman[220636]: 2026-01-30 09:56:37.379223212 +0000 UTC m=+0.044134494 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 30 04:56:43 np0005601978 podman[220656]: 2026-01-30 09:56:43.388281538 +0000 UTC m=+0.050501377 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:56:44 np0005601978 nova_compute[182955]: 2026-01-30 09:56:44.198 182959 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: Too many heartbeats missed#033[00m
Jan 30 04:56:47 np0005601978 podman[220679]: 2026-01-30 09:56:47.42568193 +0000 UTC m=+0.079584088 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 30 04:56:51 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:51Z|00319|chassis|WARN|Dropped 32 log messages in last 19 seconds (most recently, 8 seconds ago) due to excessive rate
Jan 30 04:56:51 np0005601978 ovn_controller[95419]: 2026-01-30T09:56:51Z|00320|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:56:51 np0005601978 podman[220698]: 2026-01-30 09:56:51.443397199 +0000 UTC m=+0.093635485 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 30 04:56:52 np0005601978 nova_compute[182955]: 2026-01-30 09:56:52.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:52 np0005601978 nova_compute[182955]: 2026-01-30 09:56:52.434 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:56:52 np0005601978 nova_compute[182955]: 2026-01-30 09:56:52.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:56:52 np0005601978 nova_compute[182955]: 2026-01-30 09:56:52.435 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:56:54 np0005601978 podman[220724]: 2026-01-30 09:56:54.39651653 +0000 UTC m=+0.052003204 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 30 04:56:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:57.378 104657 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:56:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:57.379 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:56:57 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:56:57.379 104657 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:57:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:57:06Z|00321|chassis|WARN|Dropped 2 log messages in last 15 seconds (most recently, 5 seconds ago) due to excessive rate
Jan 30 04:57:06 np0005601978 ovn_controller[95419]: 2026-01-30T09:57:06Z|00322|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:57:06 np0005601978 podman[220748]: 2026-01-30 09:57:06.387150803 +0000 UTC m=+0.051391658 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1769056855, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 30 04:57:08 np0005601978 podman[220770]: 2026-01-30 09:57:08.37840439 +0000 UTC m=+0.042947666 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Jan 30 04:57:14 np0005601978 podman[220791]: 2026-01-30 09:57:14.40214527 +0000 UTC m=+0.065497028 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 30 04:57:18 np0005601978 podman[220814]: 2026-01-30 09:57:18.39870689 +0000 UTC m=+0.063991921 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 30 04:57:22 np0005601978 ovn_controller[95419]: 2026-01-30T09:57:22Z|00323|chassis|WARN|Dropped 1 log messages in last 16 seconds (most recently, 16 seconds ago) due to excessive rate
Jan 30 04:57:22 np0005601978 ovn_controller[95419]: 2026-01-30T09:57:22Z|00324|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:57:22 np0005601978 podman[220834]: 2026-01-30 09:57:22.416525503 +0000 UTC m=+0.081353500 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 30 04:57:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:23.954 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:57:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:23.954 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: receive error: Transport endpoint is not connected#033[00m
Jan 30 04:57:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:23.956 105213 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:57:23 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:23.956 104657 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection dropped (Transport endpoint is not connected)#033[00m
Jan 30 04:57:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:24.964 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:57:24 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:24.965 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:57:25 np0005601978 podman[220860]: 2026-01-30 09:57:25.40853053 +0000 UTC m=+0.063181733 container health_status e257ee5ab5993b37a8ce63df2a2728e26d69a34084ec127e804eaf4aa807d901 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 30 04:57:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:25.966 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:57:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:25.966 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:57:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:25.967 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:57:25 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:25.967 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect#033[00m
Jan 30 04:57:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:27.972 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:57:27 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:27.973 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:57:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:29.975 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:57:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:29.976 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:57:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:29.976 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:57:29 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:29.976 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect#033[00m
Jan 30 04:57:33 np0005601978 nova_compute[182955]: 2026-01-30 09:57:33.027 182959 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 45.46 sec#033[00m
Jan 30 04:57:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:33.985 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:57:33 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:33.985 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.076 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.077 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.078 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.078 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.078 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.079 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.079 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.079 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.080 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.164 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.165 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.165 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.165 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.331 182959 WARNING nova.virt.libvirt.driver [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.332 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5961MB free_disk=73.35983657836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.332 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.332 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.487 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.488 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.560 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing inventories for resource provider 5912bad0-7860-4f37-8078-1db5720295f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.651 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating ProviderTree inventory for provider 5912bad0-7860-4f37-8078-1db5720295f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.652 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Updating inventory in ProviderTree for provider 5912bad0-7860-4f37-8078-1db5720295f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.667 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing aggregate associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.688 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Refreshing trait associations for resource provider 5912bad0-7860-4f37-8078-1db5720295f4, traits: COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.710 182959 DEBUG nova.compute.provider_tree [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5912bad0-7860-4f37-8078-1db5720295f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.869 182959 DEBUG nova.scheduler.client.report [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Inventory has not changed for provider 5912bad0-7860-4f37-8078-1db5720295f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.871 182959 DEBUG nova.compute.resource_tracker [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.871 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.871 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.871 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.872 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.872 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.872 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.872 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.873 182959 DEBUG oslo_concurrency.lockutils [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.893 182959 DEBUG nova.virt.libvirt.imagecache [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.894 182959 WARNING nova.virt.libvirt.imagecache [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.894 182959 INFO nova.virt.libvirt.imagecache [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Removable base files: /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.895 182959 INFO nova.virt.libvirt.imagecache [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/27f3756dd30074249f54b073a56d4c88beec31b4#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.895 182959 DEBUG nova.virt.libvirt.imagecache [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.895 182959 DEBUG nova.virt.libvirt.imagecache [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 30 04:57:35 np0005601978 nova_compute[182955]: 2026-01-30 09:57:35.895 182959 DEBUG nova.virt.libvirt.imagecache [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 30 04:57:37 np0005601978 podman[220884]: 2026-01-30 09:57:37.383331602 +0000 UTC m=+0.047690759 container health_status 95a6d2e94df8c67db6e378726cb21004219220cbd653240fceb8fb07b53f9f28 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 30 04:57:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:37.987 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:57:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:37.987 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:57:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:37.990 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out#033[00m
Jan 30 04:57:37 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:37.990 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging#033[00m
Jan 30 04:57:39 np0005601978 podman[220905]: 2026-01-30 09:57:39.371273338 +0000 UTC m=+0.037124854 container health_status 33bea565b521d5cfa58fe55de6b0bb87f6e67fcca3a1762b5a628a968b0c1ab6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_managed=true)
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.433 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.475 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.475 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.475 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.489 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.489 182959 DEBUG oslo_service.periodic_task [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.489 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 30 04:57:39 np0005601978 nova_compute[182955]: 2026-01-30 09:57:39.508 182959 DEBUG nova.compute.manager [None req-19f9cbb9-438e-41c1-bc4f-74fd453004a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 30 04:57:44 np0005601978 systemd-logind[793]: New session 28 of user zuul.
Jan 30 04:57:44 np0005601978 systemd[1]: Started Session 28 of User zuul.
Jan 30 04:57:44 np0005601978 podman[220952]: 2026-01-30 09:57:44.523113491 +0000 UTC m=+0.051061061 container health_status a6a53ab36fbd57321d118e56ad1fb3ae1cc7eb9b1155827d023f0923f5b132dc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 30 04:57:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:46.009 105213 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:57:46 np0005601978 ovn_metadata_agent[104652]: 2026-01-30 09:57:46.010 104657 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 30 04:57:49 np0005601978 podman[221105]: 2026-01-30 09:57:49.007047787 +0000 UTC m=+0.047641078 container health_status 6f37c6681f81119f9f096b68774e82a6cc530aaa0ea394128542748f6849a959 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 30 04:57:49 np0005601978 ovs-vsctl[221151]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 30 04:57:50 np0005601978 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220953 (sos)
Jan 30 04:57:50 np0005601978 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 30 04:57:50 np0005601978 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 30 04:57:50 np0005601978 virtqemud[185722]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 30 04:57:50 np0005601978 virtqemud[185722]: hostname: compute-1
Jan 30 04:57:50 np0005601978 virtqemud[185722]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 30 04:57:50 np0005601978 virtqemud[185722]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 30 04:57:50 np0005601978 virtqemud[185722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 30 04:57:53 np0005601978 systemd[1]: Starting Hostname Service...
Jan 30 04:57:53 np0005601978 ovn_controller[95419]: 2026-01-30T09:57:53Z|00325|chassis|WARN|Dropped 1 log messages in last 31 seconds (most recently, 31 seconds ago) due to excessive rate
Jan 30 04:57:53 np0005601978 ovn_controller[95419]: 2026-01-30T09:57:53Z|00326|chassis|WARN|'d14b9ab5-bf6e-4142-ad45-b863645e483d' already has encap ip '172.19.0.101' and type 'geneve', cannot duplicate on '9803b804-d88a-4443-b777-6ecddbb75ed8'
Jan 30 04:57:53 np0005601978 systemd[1]: Started Hostname Service.
Jan 30 04:57:53 np0005601978 podman[221661]: 2026-01-30 09:57:53.433397109 +0000 UTC m=+0.114513659 container health_status 4573a3545ad403d3884a7729587e1f8bf28343671d4edf9f74c9b8af18d1a089 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7135d48b44e59f81b3ccd8a4e8d9f3efa994906c0deae2f640c2d1552bac7007-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5-f175b4f8c0cf558e0af73e5ca1d17bd98e731f73c7b021ba46a424e5145729f5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
